Workshop on Light Field Imaging
February 12, 2015
MacKenzie Conference Room, Huang Engineering Center, Stanford University
The Workshop on Light Field Imaging will include a summary of state-of-the-art research and a glimpse into the future of technologies designed to capture and create light rays in a three-dimensional scene. Participants will leave with a better understanding of the concept of a light field as it is used in geometric optics, computer vision, computer graphics and computational photography. The Workshop will include talks that summarize recent advances in light field cameras and light field displays, as well as applications of these technologies in entertainment, consumer devices, industrial applications and medical imaging. The Workshop will also include an interactive session with experts from industry and academia addressing questions about the killer applications and challenges in product development, new areas for research and graduate training, and the future of light field imaging. There will also be a technology demo session that will include presentations by research labs and startup companies.
Special thanks to Light Field Interactive and our sponsors:
Light Field Imaging - Program
|8:00 - 8:45 a.m.||Registration and Continental Breakfast|
|8:45 - 9:00 a.m.||Welcome and Opening Remarks||Dave Singhal, Light Field Interactive; Joyce Farrell, Stanford Center for Image Systems Engineering|
|9:00 - 9:30 a.m.||Light Field Capture||Dr. Kurt Akeley, Lytro, USA|
|9:30 - 10:00 a.m.||Light Field Displays||Prof. Gordon Wetzstein, Stanford University, USA|
|10:00 - 10:30 a.m.||Lighting, Digitizing, and Displaying People with Light Fields||Prof. Paul Debevec, University of Southern California, USA|
|10:30 - 10:50 a.m.||Coffee Break|
|10:50 - 11:20 a.m.||Light Field Imaging and Integral Imaging - Applications for Depth Capture and Display||Prof. Byoungho Lee, Seoul National University, Korea|
|11:20 - 11:50 a.m.||The Progress of Light Field Real 3D Display||Prof. Liu Xu, Zhejiang University, China|
|11:50 - 12:20 p.m.||Near-to-Eye Light Field Displays for Augmented Reality||Prof. Hong Hua, University of Arizona, USA|
|12:20 - 1:30 p.m.||Lunch Break|
|1:30 - 5:00 p.m.||Interactive Demo Session||Produced by Lily Achatz, Light Field Interactive|
|5:30 - 7:00 p.m.||Panel Discussion|
Topics:
•Killer applications and challenges
•New areas for research and graduate student training
•The future of light field imaging
Introduction by Layla Mah, AMD
Moderator: Prof. Bernd Girod, Stanford
Panelists:
•Dr. Ren Ng, Lytro
•Dr. Douglas Lanman, Oculus
•Dr. Ozan Cakmakci, Google
•Dr. David Luebke, NVIDIA
Light Field Imaging - Interactive Demo Session
|A 360-degree levitating light field display that provides significant improvements in light field reconstruction efficiency, full-parallax adaptive rendering complexity, and user-friendly interaction: We present a prototype that uses a flat-plate deflected diffuser screen and a high-speed projector to create a 360-degree floating scene in the air. A panoramic annular lens omni-directionally tracks users' faces and provides vertical motion parallax in real-time rendering. A Leap Motion controller tracks all ten fingers simultaneously and enhances the natural interaction experience.|
|Wide Field of View Augmented Reality Eyeglasses using Defocused Point Light Sources: We present a novel design for an optical see-through augmented reality display that offers a wide field of view and supports a compact form factor approaching ordinary eyeglasses. Instead of conventional optics, our design uses only two simple hardware components: an LCD panel and an array of point light sources (implemented as an edge-lit, etched acrylic sheet) placed directly in front of the eye, out of focus.|
|Panoptic Camera - A 360-degree field-of-view (FOV) multi-camera platform. The Panoptic camera is an omnidirectional imaging system capable of reconstructing a full-FOV panorama in real time, displaying it on a client PC/tablet, streaming it online, or displaying it on a virtual reality head-mounted display. We will present real-time operation of a miniature prototype consisting of 15 image sensors connected to an Oculus Rift. In addition, we will demonstrate a telepresence capability by connecting to our lab in Lausanne, Switzerland, where the output of a larger, higher-resolution prototype is streamed over a web server.|
|Augmented Reality for Museums on Epson Moverio: As part of a larger project investigating uses of augmented reality in a museum environment, we will demonstrate our prototype on AR glasses (Epson Moverio BT-200). This system recognizes an artwork from a database stored on the device and augments the view in real time with a customized label providing more information about the object.|
|Near-Eye Light Field Displays that enable thin, lightweight head-mounted displays (HMDs) capable of presenting nearly correct convergence, accommodation, binocular disparity, and retinal defocus depth cues.|
|Oculus Crescent Bay Prototype: Crescent Bay is the latest prototype headset on the path to the consumer version of the Rift. Crescent Bay features new display technology, 360° head tracking, expanded positional tracking volume, dramatically improved weight and ergonomics, and high-quality integrated audio. The upcoming Oculus Audio SDK uses Head-Related Transfer Function (HRTF) technology in conjunction with the Rift's head tracking to achieve a sense of true 3D audio spatialization. Along with the new hardware, we've created original demo content, which we're calling the "Crescent Bay Experiences," developed in-house by our content team specifically for Oculus Connect. The demo is designed to demonstrate the power of presence and give you a glimpse into the level of VR experience you can expect to see come to life in gaming, film, and beyond.|
|OTOY will demonstrate a groundbreaking immersive light field experience in virtual reality. The demo brings to life via interactive holographic video the Batcave from the acclaimed Emmy Award–winning Batman: The Animated Series. The interactive narrative experience will give viewers the opportunity to explore Batman’s world, allowing them to feel what it is like to be inside the show’s stylized universe on devices such as the Oculus Rift.|
|Lytro Development Kit (LDK): The LDK provides imaging researchers with the highest degree of control of Lytro’s advanced light field capture devices and processing software engine and paves the path for deeper partnerships with technical R&D teams and enterprises in new undiscovered scientific territories.|
|Intel RealSense Technology: Intel® RealSense™ technology lets you interact with your devices more like you interact with people - with natural movements. The technology is powered by the real-time depth-sensing Intel RealSense 3D camera, available today in select AIO and notebook devices. Pairing 3D input with a floating 3D display, Intel is designing prototype systems that enable interactive mid-air interfaces.|
|TruLife Optics designs and manufactures transparent full color holographic wave-guided optics for AR, Eye Tracking using IR, and floating wave-guided holograms for computers and tablets. We will demonstrate our optics as well as a floating keypad hologram paired with a Leap Motion controller.|
|Pelican will demonstrate depth-enhanced video and still image capture with a super-thin array camera built into a Qualcomm reference design tablet. Along with an accurate depth map generated in real time, we’ll show photos with motion parallax and a range of post-capture image edits such as refocus, matting, background substitution, and distance measurement.|
|While big-format VR and small-screen 'glanceables' have successfully launched the wearables industry, Innovega's unique eye-borne optics platform "cracks the code," uniquely delivering any digital media (panoramic, HD, 3D, transparent) from lightweight fashion glasses.|
|Transmissive Head Mounted Display for Virtual, Augmented and Collaborative Reality Applications with Optical Tracking for 3D Positioning|
|Occipital develops state-of-the-art computer vision hardware and software. Their most recent product, the Structure Sensor, is the first 3D sensor for mobile devices.|
|CastAR is a magical experience that allows groups of people to see and interact with 3D objects that spring from your coffee table, workstation, wall, or other surfaces.|
|Realtime TheatriX creates immersive 360-degree, horizon to sky, 3D, interactive social entertainment "blended reality" experiences for 15 to 20 simultaneous participants using Technical Illusions' CastAR technology.|
|Augmented reality is set to evolve. Coming soon to Kickstarter, Seebright has developed wide field of view display technology for mobile virtual and augmented reality. Seebright is now welcoming pioneering developers to build innovative experiences for AR and VR on one platform, ranging from information displays to interactive games and academic and professional applications. Seebright will be demonstrating their latest prototype and mixed reality experiences.|
|SMI Eye Tracking Glasses: Glasses-based 60 Hz system used for real-world interactions, sports, kinesiology, hand-eye coordination, driving, biomechanics, rehabilitation, etc. Small, light, and designed for maximal peripheral and binocular view, with wireless recording and remote annotation in real time. Highly robust technology with over 100,000 participants recorded.
SMI Eye Tracking for Oculus DK2: Binocular, 60 Hz eye tracking integrated into the Oculus DK2 HMD. Includes an SDK for real-time streaming of eye tracking data with support for VR engine integration. Based on the popular ETG platform and used for fully immersive visual perception analysis in VR environments.|
|Leap Motion will demonstrate the Arm HUD Widget - a completely customizable heads-up display that sits on top of a user's forearm inside virtual reality spaces.|