CNET reports on John Deere’s invention of an autonomous robot tractor you can control with your smartphone:
“It takes a while to get comfortable because … first of all, you’re just kind of amazed just watching it,” said [Minnesota farmer Doug] Nimz, who on a windy October afternoon described himself as “very, very interested” but also a “little suspicious” of autonomous technology before using John Deere’s machine on his farm. “When I actually saw it drive … I said, ‘Well, goll, this is really going to happen. This really will work.'”
Rather than creating a brand-new machine, the company unveiled equipment that can be added to its popular 8R 410 tractors for full autonomy. Two boxes — one on the front and the other in the back — contain a total of 12 stereo cameras and an Nvidia GPU that let a farmer control the machine from a smartphone, starting it with a swipe of a button and watching live video as the machine moves across a field.
John Deere’s tractors have been capable of steering themselves for two decades — as long as the farmer still sits behind the wheel. That fact makes the move to a fully autonomous tractor less of a stretch for a farmer than for someone who’s spent the last 20 years driving a Honda Civic.
The plan this year is to let a limited number of farmers use the autonomous system. During the initial rollout, Deere will rent a full tractor and chisel plow to about 10 to 50 producers who have steady internet connectivity on their farms and have an interest in using the technology.
Later on, Deere will let farmers bring their own tractors to be retrofitted with the autonomous technology.
Most autonomous cars being tested use a depth sensor called lidar, while Tesla employs an array of cameras, sensors and radar. John Deere, however, believes that stereo cameras are the way to get self-driving technology into as many fields as possible. Its autonomous machine has 12 such cameras.
To make the 8R autonomous, John Deere mounts a stereo camera pod on the front of the tractor and another pod on the back. Each pod has three pairs of ruggedized stereo cameras that essentially work like human eyes. Images from both cameras in a pair are combined to help the machine locate potential obstacles that are between 45 feet and 90 feet away.
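To illustrate how a stereo pair can estimate distance the way human eyes do, here is a minimal sketch of the classic pinhole-stereo depth relation. The focal length, baseline, and disparity values are illustrative assumptions, not John Deere's actual calibration figures.

```python
# Sketch of how one stereo camera pair estimates distance to an obstacle.
# All numbers (focal length, baseline, disparity) are illustrative
# assumptions, not John Deere's real calibration values.

def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from the pinhole stereo relation Z = f * B / d.

    focal_px     -- camera focal length, in pixels
    baseline_m   -- separation between the two cameras, in meters
    disparity_px -- horizontal shift of the same point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (object unmatched or at infinity)")
    return focal_px * baseline_m / disparity_px

M_PER_FT = 0.3048

# A point that shifts 4.8 px between images, assuming an 800 px focal
# length and a 12 cm baseline, works out to about 20 m:
z = stereo_depth_m(800, 0.12, 4.8)
print(f"{z:.1f} m = {z / M_PER_FT:.1f} ft")  # ~65.6 ft, inside the 45-90 ft band
```

The key intuition is that nearby objects shift more between the two images (larger disparity), so depth falls off as 1/d — which is why a detection range like 45 to 90 feet depends on the cameras' baseline and resolution.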