Concepedia

Publication | Open Access

Gibson Env: Real-World Perception for Embodied Agents

56 Citations · 0 References · 2018

TLDR

Developing visual perception models for active agents in the physical world is hampered by slow algorithms and fragile, costly robots, prompting a shift to learning-in-simulation that raises questions about real-world transferability. This work investigates real-world perception for active agents by proposing the Gibson Environment and demonstrating a suite of perceptual tasks learned within it. Gibson virtualizes over 1,400 real-world floor spaces from 572 buildings, incorporates an internal "Goggles" synthesis module that removes the need for domain adaptation, and embodies agents under realistic physics constraints.

Abstract

Developing visual perception models for active agents and sensorimotor control in the physical world is cumbersome, as existing algorithms are too slow to learn efficiently in real time and robots are fragile and costly. This has given rise to learning-in-simulation, which in turn raises the question of whether the results transfer to the real world. In this paper, we investigate developing real-world perception for active agents, propose the Gibson Environment for this purpose, and showcase a set of perceptual tasks learned therein. Gibson is based on virtualizing real spaces, rather than artificially designed ones, and currently includes over 1,400 floor spaces from 572 full buildings. The main characteristics of Gibson are: I. it comes from the real world and reflects its semantic complexity; II. it has an internal synthesis mechanism, "Goggles", that enables deploying trained models in the real world without domain adaptation; III. it embodies agents, making them subject to the constraints of physics and space.