Artificial Perception (= right brain): algorithms/software to perceive 3D and kinesthetic senses, as the complementary pair to Artificial Intelligence.
• Camera / image sensor (= eye / retina): the physical sensing input
• Artificial Intelligence: stored, big-data-based "learning" for semantic extraction (pattern recognition; semantic/language processing)
• Artificial Perception (= right brain): real-time sensing "instinct" for geometric acquisition (space recognition; sense of control/motion)
Sophisticated algorithms to perceive and interpret 3D and kinesthetic senses from all the related sensor inputs.

Sense / Sensor (from rich data to simple data):
§ Vision: camera
§ Depth: ToF, LiDAR
§ Inertia: IMU, gyroscope
§ Mechanical odometry: motor, actuator
§ Position: beacon, GPS

Kudan's capabilities (AP):
§ Spatial mapping/reconstruction
§ Position tracking/odometry
§ 3D recognition
§ Orientation/posture tracking
§ Scale detection
§ Azimuth/vertical detection

Sophisticated algorithms perceive the rich sensor data; advanced interfaces interpret and integrate the simple sensor data.
Kudan's SLAM (Simultaneous Localization and Mapping) enables real-time 3D mapping and position tracking.

Input: 2D images from multiple viewpoints along the movement of the image sensor.
Output:
§ Real-time 6DoF trajectory of the viewpoint
§ Real-time 3D map of the feature points
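As a minimal illustration of the geometry behind this (a sketch, not Kudan's implementation), the snippet below triangulates a 3D feature point from its 2D projections in two views with known camera poses, which is the core operation that turns tracked 2D features into a 3D map. The camera intrinsics, poses, and the feature point are all made-up example values.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one feature point seen from two viewpoints.

    P1, P2: 3x4 camera projection matrices of the two viewpoints.
    x1, x2: 2D pixel coordinates (x, y) of the same feature in each image.
    Returns the estimated 3D point.
    """
    # Each observation contributes two linear constraints on the homogeneous
    # 3D point X: x * (P[2] @ X) - P[0] @ X = 0 and y * (P[2] @ X) - P[1] @ X = 0.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenise

# Example intrinsics and two viewpoints: identity pose, then a 1-unit baseline.
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known 3D feature point into both views, then recover it.
X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1); x2 = x2[:2] / x2[2]

X_est = triangulate(P1, P2, x1, x2)
print(X_est)
```

A real SLAM system repeats this over thousands of matched features per frame while simultaneously estimating the 6DoF poses themselves, typically refining both jointly via bundle adjustment.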
Also provided as a modular framework for use in other solutions and for embedding on chipsets, to support future devices in a versatile way.

Versatile:
• Portable to any processing architecture
• Flexible to camera setups and peripherals
• Configurable to the required use cases

Modularized:
• A stack of 50+ modular frameworks (e.g. point matching, image blurring) usable as plug-ins for other solutions

Practicable:
• Fast processing with constant resource consumption
• Accurate mapping and tracking
• Robust to unpredictable movements and re-localization
A technology applicable to the required hardware setups and use cases. Setup examples to which KudanSLAM is applicable:

Flexibility of camera setup:
§ Camera configuration: mono, stereo, multiple
§ Shutter: rolling, global
§ Lens: fisheye, omnidirectional

Integration with sensors:
§ Tracking sensors: IMU, GPS, beacon
§ Depth sensors: RGB-D, ToF, LiDAR

Capability across hardware:
§ Mapping and localisation across different hardware: cameras, lenses, depth sensors
§ Allocation of computing and data flow: local, edge, cloud

Portability across platforms:
§ OS: Android, iOS, Linux, OSX, Windows, and any other required
§ Processing architecture: CPU (ARM/Intel), GPU, DSP, and any other required

Configurability to use cases:
§ Performance: accuracy, speed, robustness, data size
§ Output: position (localization/re-localization), point cloud density (sparse/dense)
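To make the configuration axes above concrete, here is a hypothetical configuration object. The field names, types, and defaults are illustrative assumptions only, not Kudan's actual API; they simply mirror the options listed above.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical configuration sketch: names are illustrative, not Kudan's API.
@dataclass
class SlamConfig:
    # Camera setup
    camera: str = "mono"                # "mono" | "stereo" | "multiple"
    shutter: str = "rolling"            # "rolling" | "global"
    lens: str = "fisheye"               # "fisheye" | "omnidirectional"
    # Additional sensors
    tracking_sensors: List[str] = field(default_factory=lambda: ["IMU"])
    depth_sensor: Optional[str] = None  # "RGB-D" | "ToF" | "LiDAR"
    # Deployment of computing and data flow
    compute: str = "local"              # "local" | "edge" | "cloud"
    # Use-case tuning
    point_cloud: str = "sparse"         # "sparse" | "dense"

# Example: a stereo rig with a ToF depth sensor producing a dense point cloud.
cfg = SlamConfig(camera="stereo", depth_sensor="ToF", point_cloud="dense")
print(cfg)
```

The point of such a structure is that one code path can serve many hardware setups: the same mapping/tracking pipeline reads the configuration and enables or disables sensor inputs accordingly.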
Future vision: Kudan's Artificial Perception algorithm architecture will become the fundamental, cross-device technology platform at the bottom of the industry stack, supporting all vision-related solutions.

Target devices: mobile, wearable, drone, robotics, automotive, CCTV.

Layers (top to bottom):
§ Service
§ Hardware / system-on-chip IP (S/W libraries, H/W IPs)
§ Algorithm architecture
§ Processing architecture