Over the years, I have invented, filed for, and been granted a number of patents. These began in the location-based marketing space (developed long before smartphones had GPS), extended to bar/QR-code rendering optimization, and, most passionately, to the interactive / motion-tracking space. Here’s a summary of a few of the key patents that I’ve been granted by the USPTO (Gregory Roberts Patents):
WonderPixel
Apparatus and method for creating a crowd-based visual display with pixels that move independently
This invention is specifically targeted at major stadium events (Olympic Opening Ceremony, Super Bowl, World Cup), and more generally at any arbitrary dynamic matrix of moving / swarming pixels (think drone shows). The key idea is to have a high-performance sensing mesh and lightweight broadcast system (in one embodiment, a scanning infrared laser) that allows any arbitrary array of controllable light sources (think: handheld LED wands, smartphone screens, drone emitters) to display a full-color still image, video stream, or even holographic artifacts.
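The patent covers the full sensing-and-broadcast pipeline; as a toy illustration of just the final step, here is a minimal Python sketch (the function names and the normalized-coordinate convention are my own assumptions, not the patent's) that maps already-sensed emitter positions onto the colors of a target frame:

```python
import numpy as np

def colors_for_emitters(emitter_xy, frame):
    """Map sensed emitter positions (normalized 0..1) to the colors of a target frame.

    emitter_xy: (N, 2) array of normalized (x, y) positions, one per light source.
    frame:      (H, W, 3) uint8 RGB image to be "painted" across the crowd.
    Returns an (N, 3) array of RGB values, one per emitter.
    """
    h, w = frame.shape[:2]
    # Convert normalized coordinates to integer pixel indices, clamped to the frame.
    cols = np.clip((emitter_xy[:, 0] * (w - 1)).round().astype(int), 0, w - 1)
    rows = np.clip((emitter_xy[:, 1] * (h - 1)).round().astype(int), 0, h - 1)
    return frame[rows, cols]

# Example: 10,000 phones scattered at random across the stadium bowl.
emitters = np.random.rand(10_000, 2)
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame[:, :960] = (255, 0, 0)   # left half red
frame[:, 960:] = (0, 0, 255)   # right half blue
commands = colors_for_emitters(emitters, frame)  # per-emitter colors to broadcast
```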
Patent: US 8,049,688
Filed / Issued : 2006.07.07 / 2011.11.01
Inventors: Greg Roberts, Matthew Flagg, Suzanne Roshto, Young Ki YU
Google Patents | Justia Law | Download PDF : US8049688
Markerless Motion Capture
Interactive imaging systems and methods for motion control by users
This invention and its continuations sit at the heart of the PlayMotion System which Matt, Suzanne, Jeremy and I pioneered from 2003-2009. The basic premise was a projector/camera system (thank you Raskar, Fuchs, Utterback) in which the actual lightspeed optical shadow of the player(s) within the playspace served as the avatar. Optical shadows hold massive advantages over digitally rendered avatars: they have effectively zero latency (speed of light), effectively infinite resolution (far beyond 4K), and they are monochrome (pure black), which lets players use their imagination and environmental context to map whatever characters / archetypes they want onto the image.
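In the purest embodiment the avatar is the literal optical shadow, so there is nothing to compute; for the camera-driven embodiments, the player silhouette has to be extracted from the video feed. Here is a minimal OpenCV sketch of that step, assuming a grayscale camera frame and a pre-captured empty-scene background (function names and the threshold value are illustrative, not the patent's):

```python
import cv2
import numpy as np

def silhouette(frame_gray, background_gray, thresh=30):
    """Extract a binary player silhouette by differencing against an empty-scene frame."""
    diff = cv2.absdiff(frame_gray, background_gray)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    # Clean up sensor noise so the silhouette reads as a solid shape.
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

def composite(scene_bgr, mask):
    """Paint the silhouette as a pure-black avatar over the projected scene."""
    out = scene_bgr.copy()
    out[mask > 0] = (0, 0, 0)
    return out
```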
Patent: US 9,674,514
Filed / Issued : 2006.04.13 / 2017.06.06
Inventors: Greg Roberts, Matthew Flagg
Google Patents | Justia Law | Download PDF : US9674514
Virtual Avatar / Agentic Interactions
System and method for enabling meaningful interaction with video based characters and objects
This patent presages much of the recent work (2022 and beyond) on LLM-based multi-modal vision models, for instance GPT-4V. We present a highly streamlined machine-learning method which reduces binary silhouettes of actual human motion to horizontal and vertical histograms, then uses those histograms to heuristically reconstruct virtual avatar sequences. This is analogous to Tesla FSD v12, which, instead of relying on hand-coded driving algorithms, simply trained on the massive existing corpus of recorded human driving (billions of hours from 7 cameras per Tesla), thus extracting the core of the human driving experience. But we did this 12 years ago. 😉
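The histogram reduction itself is only a few lines; here is a minimal NumPy sketch of that step (names and the normalization choice are my own, for illustration):

```python
import numpy as np

def silhouette_histograms(mask):
    """Reduce a binary silhouette (H, W) to its horizontal and vertical projections.

    mask: 2-D array where nonzero pixels belong to the person.
    Returns (horizontal, vertical): per-column and per-row occupancy histograms.
    """
    mask = (mask > 0).astype(np.int32)
    horizontal = mask.sum(axis=0)   # one count per column: how "tall" the figure is there
    vertical = mask.sum(axis=1)     # one count per row: how "wide" the figure is there
    # Normalize so histograms are comparable across camera distances / body sizes.
    total = max(int(mask.sum()), 1)
    return horizontal / total, vertical / total
```

One plausible reading of the reconstruction step, in this simplified form, is a nearest-neighbor lookup: each incoming histogram pair is matched against a library of histograms computed from pre-recorded avatar poses, and the best-matching pose is played back.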
US Patent: 8,538,153
Filed / Issued : 2011.09.23 / 2013.09.17
Inventors: Greg Roberts, Matthew Flagg
Google Patents | Justia Law | Download PDF
Auto-Calibration for VR Environments
System and associated methods of calibration and use for an interactive imaging environment
A major challenge in human-scale interactions, whether at theme parks or in home VR environments, is calibrating the equipment to the physical space, including walls, furniture, props, and obstacles. There is the added issue of “homography”: correcting skewed camera / sensor fields of view (FoVs) so they align with the geometric axes (NSEW, up/down, left/right) of both the user and the room. Herein we present automated methods for sensing and calibrating both of these elements. The most practical implementation of these techniques is embodied in the automatic “geometry correction” functions available in most home theatre projectors, which paint a perfectly rectangular screen on the wall regardless of projector skew / tilt / orientation.
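At the core of geometry correction is a homography estimate between where the projection actually lands and where it should land. A minimal OpenCV sketch of that estimate, assuming a camera has already located the four corners of a projected test pattern (the exact pre-warp direction depends on whether you work in camera or projector coordinates, so treat this as illustrative):

```python
import cv2
import numpy as np

def rectify_projection(camera_frame, detected_corners, out_w=1920, out_h=1080):
    """Estimate the homography from the skewed projected quad to an ideal rectangle.

    detected_corners: 4x2 array of where the projector's corners actually land,
                      ordered top-left, top-right, bottom-right, bottom-left,
                      as seen by the calibration camera.
    """
    src = np.float32(detected_corners)
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    # Homography mapping the skewed quad back onto the axis-aligned rectangle.
    H = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(camera_frame, H, (out_w, out_h)), H
```

The same homography, applied in the projector's own coordinate frame, is what lets a projector pre-warp its output so the image arrives rectangular on the wall.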
US Patent: 8,867,835
Filed / Issued : 2011.09.23 / 2014.10.21
Inventors: Greg Roberts, Matthew Flagg
Google Patents | Justia Law | Download PDF
Location-Based Smartphone Ad Targeting
Method and system for electronic delivery of incentive information based on user proximity
Long before GPS was standard in smartphones, we developed this method of triangulating a user's position (lat/lon geo-location) from the closest cell towers and signal time-of-flight, then dynamically targeting ads and promotions based on which vendors and brick-and-mortar retail stores were nearby or along the user's route of travel. This invention presaged both the iPhone and the broader geo-targeted mobile ad market by many years.
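For a feel of the positioning step, here is a minimal trilateration sketch in Python, assuming time-of-flight has already been converted to range estimates against three or more towers with known positions in a local meter grid (all names are illustrative; the patent describes the full delivery system, not just this step):

```python
import numpy as np

def trilaterate(towers, distances):
    """Estimate a handset's planar position from range estimates to known towers.

    towers:    (N, 2) tower coordinates in a local meter grid, N >= 3.
    distances: (N,) range estimates (speed of light * one-way time of flight).
    """
    towers = np.asarray(towers, dtype=float)
    d = np.asarray(distances, dtype=float)
    x1, y1 = towers[0]
    # Linearize by subtracting the first range equation from the rest.
    A = 2 * (towers[1:] - towers[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + towers[1:, 0] ** 2 - x1 ** 2
         + towers[1:, 1] ** 2 - y1 ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos  # (x, y) in the local grid; convert to lat/lon as a final step

# Example: three towers, handset truly at (400, 300) meters.
towers = [(0, 0), (1000, 0), (0, 1000)]
true = np.array([400.0, 300.0])
dists = [np.linalg.norm(true - t) for t in towers]
print(trilaterate(towers, dists))  # ~ [400. 300.]
```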
US Patent: 8,554,611
Filed / Issued : 2004.09.10 / 2013.10.08
Inventors: Greg Roberts, Scott Wills
Google Patents | Justia Law | Download PDF
Profile-Based Digital Marketing
This patent presaged the entire industry of profile-based ad targeting. ’nuf said.
US Patent: 8,620,732
Filed / Issued : 2004.09.10 / 2013.12.31
Inventors: Greg Roberts, Scott Wills
Google Patents | Justia Law | Download PDF
Device-Independent Optimized BarCode Rendering
System and method for bar code rendering and recognition
This was my first patent, and it solved the problem of rendering clear, accurate, and fully scannable bar-codes on any device (printer, phone screen, data projector), regardless of device resolution. This is a foundational technology that later enabled such things as Apple Wallet and phone-to-phone QR codes.
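The patent covers the full rendering and recognition pipeline; the sketch below only illustrates the core device-independence idea, snapping the bar "module" width to whole device pixels so edges never fall between pixels (the function names and parameters are my own, for illustration):

```python
def module_width_px(target_width_px, num_modules, min_module_px=2):
    """Choose a module width in whole device pixels for a given output device.

    target_width_px: how wide the code may be on this particular device.
    num_modules:     total number of narrow-bar units in the symbol (incl. quiet zones).
    """
    # Integer division keeps every bar an exact multiple of device pixels.
    return max(target_width_px // num_modules, min_module_px)

def render_row(modules, module_px):
    """Expand a module pattern (list of 0/1 values) into a row of device pixels."""
    row = []
    for m in modules:
        row.extend([m] * module_px)
    return row

# Example: a 95-module symbol on a 320-pixel-wide phone screen.
px = module_width_px(320, 95)   # -> 3 px per module: crisp, scannable bars
```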
US Patent: 6,882,442
Filed / Issued : 2002.12.09 / 2005.04.19
Inventor: Greg Roberts
Google Patents | Justia Law | Download PDF