Follow the Agriculture Technology Podcast on Apple Podcasts, Spotify and YouTube.
You can find past podcast episodes and view show notes by visiting our podcast website.
Have precision ag questions? We have the answers. Find a specific channel dedicated to answering your precision technology questions: Precision Ag Answers.
Read the entire transcript from the latest episode.
Tony Kramer: Hi, I'm Tony Kramer, your host of the Agriculture Technology Podcast. I'm sitting down with agriculture technology and equipment experts to help you enhance your operation for today, tomorrow, and into the future. In this episode, we hear from John Deere's Senior Vice President and Chief Technology Officer, Jahmy Hindman, live from the Grand Farm Innovation Campus. Jahmy talks about the future of farming and new capabilities, from See & Spray to autonomy, and gives us the 10,000-foot perspective on how technology in agriculture is only going to continue to advance. Please enjoy Jahmy Hindman, live from the Grand Farm Innovation Campus.
Jahmy Hindman: I would start out by saying thank you to the growers, the farmers. Very much appreciate what you do and how you do it, not just for your business, but for what it means for all of us. It's a noble occupation. We're really proud to be able to help in that and to serve in the Ag industry. I think we ought to answer, why are we interested in autonomy? The answer is pretty simple. There are 8 billion souls on the planet today, and we're headed towards a number that is quite a bit north of that, 10 billion by 2050. Sometimes I get challenged on this statistic: are you sure it's going to be 10 billion by 2050? The answer is no. I don't know if it's going to be exactly 10 billion by 2050, but I don't think you can refute what's happening with respect to the world population when you look at it historically.
This is 1950 to 2023, and I would argue the trend since the 1970s, 1975, has been pretty consistent, and it's pretty easy to sort of predict where that goes. Whether it's 9.5 billion or whether it's 10 billion, I don't really know. The reality is it's a lot more than are on the planet today. Not only is it more people, but our diets are changing, quality of life fundamentally is improving across the globe. Food has probably never been more abundant than it is today in the history of humanity, and that's a good thing. People are generally living longer lifespans, and their nutrition is significantly better. Not only do we have to meet the need of an additional 2 billion people on the planet, but we have to meet the need of additional and higher quality of life for those people as well. That's the challenge, I think, that's in front of us.
If you look at the history of agriculture, I would argue we only get the opportunity to talk about technology today because of what agriculture has done for the world in the last, call it 100 years. This is the share of the population in various countries-- the top ones are India, Brazil, and Mexico. I think I had to pick France because I didn't have German data. The bottom one's the United States. The chart shows the share of the population of those countries that's involved in agriculture as a function of time since 1840. I picked that number very intentionally because that's an important number for John Deere, but 1840 on to modern times, and you can see about 2% of the population of the United States today is involved in agriculture, directly in agriculture. That's incredibly low when you think about all of the people that consume the food that 2% of the population of this country produces.
What it's done for this country, and I would argue pretty typically across the globe in most of the developed countries, and increasingly so in the non-developed countries, is it's allowed people like me to no longer work on the farm, but to work for a company like John Deere, and to work on technology solutions that, in my case, help make farming better, I hope. But in many cases, they have gone on to do lots of other things; put people on the moon, put satellites in orbit, create computers, create Facebook. I don't know if that's a good thing or not, you guys tell me. My point is, fundamentally, that this transformation in agriculture has released human potential to go do things that otherwise they wouldn't have been able to do, and I think that's an incredibly powerful thing, incredibly-- I get a little emotional about this.
It is a significant opportunity, I think, in front of us to continue that trend, and it's a significant responsibility for those of us that work proximate to the industry to continue to improve it for the betterment of all of us. That's the story of agriculture, that's why this is important. But it does lead to this significant question, if only 2% of the population is directly involved in agriculture, how do we continue to get better and more efficient? Inevitably, I think you have to ask the question around labor, where is the labor going to come from in order to continue to do the jobs that we do from an agricultural perspective? That is, I think, getting more difficult.
There are potentially isolated geographies where this isn't the case, but in general, in the state of North Dakota, this is the trend of people moving from rural environments into urban environments. I would tell you, you only have to look at the west side of Fargo to recognize this is happening, right? These are people that are moving from smaller communities often. My home community of Maxwell, Iowa has seen this. People moving out of those small communities into larger cities, and they're no longer as proximate to where the farming activities are happening, so it's creating this stress in the system, if you want to think about it that way. Where do I go to get the labor in order to do farming in the future? I think technology is not the complete answer to that question, but I do think it plays a role in helping us think about how we might begin to solve that dilemma.
I think you then have to ask the question of, why does all this technology matter now? Why are we having this conversation to begin with? There are a handful of significant technology trends that I think are radically different today than they were 10 or 15 years ago that sort of give us the opportunity to start to think about why now this is all happening. Why now can you start to think about autonomy in agriculture? Why now can you think about something as highly automated as See & Spray? I wanted to give you a picture, a relatively high-level view, of what some of those trends are that are really driving the opportunity we see to benefit industries like agriculture.
I would start with-- this is not necessarily new. We've been on a very long trajectory from 1837 to today of transforming products, of taking advantage of things like internal combustion engines with the Waterloo Gasoline Engine Company in 1918. There are moments in the history of agriculture that have been significant shifts, significant changes in the way that we did things. I think we're in the midst of one of those today, just as much as we were in 1918, transitioning from animal power on the farm to what became internal combustion engines, and eventually tractors.
This gives you a bit of a sense of how things have changed in the last 10 years or so relative to, some might argue, product complexity. I would also argue product complexity and capability. This is a picture of a row crop planter behind an 8R tractor, what it would have looked like in 2009. We would have written 4 million lines of software on that product. We would have been using a compact flash card to take data on and off the device, or on and off the tractor. We had roughly 25 controllers that were all running on a 16-bit processor. We had three major control systems, one GPS unit on the machine, and a couple hundred sensors. If you think about where things are at today, that same product goes a lot faster than it did in the previous slide. Because of that, and because of a lot of other things that have changed with that product, it's become significantly more complex.
Now it's 250-plus controllers, depending upon how many row units you might have in the planter, and 20 million lines of code. We're now connected with both terrestrial cellular connections and satellite connections on that machine; no longer do you have just the compact flash card. There are 10 major control systems. We put GPS on both the tractor and the planter so that we can make sure the planter is where it's supposed to be, not just the tractor. There's gigabit Ethernet on that machine because we're moving a whole lot of data around, and we're doing it really fast. The amount of sensing that we do on that machine has increased substantially as well.
If you think about where we've come from a software development perspective, this is just the trend over the course of time from 2008 to 2019: how many lines of code go into those products, this tractor and this planter in this case. And importantly, for some of you in the room, how many of those lines were written by hand versus how many were generated by a computer in the form of a model-based software delivery system. This is all building up to the idea that technology is woven through these machines, it has been for a very long time, and it's continuing at a fairly accelerated pace. Along with that comes the need to continue to deliver accelerating value on the product.
This chart represents-- who's familiar with GPUs, graphics processing units? A few of you. GPUs are a processing device that has the ability to process a lot of problems in parallel. That's where they fit in life, if you want to think about it that way. They are the bread and butter of a company called NVIDIA, as one of the providers. They were used originally to make video games work. They were the processing engines for the graphics in video games, believe it or not. That's how NVIDIA got its start in life. We use them today to do the processing of the images that come off the cameras on the autonomous tractor and the cameras that sit on the See & Spray boom. They're really good at parallel processing, which is what you need for modern artificial intelligence algorithms.
This curve is just a demonstration to show you the capability of those devices with respect to time. It's growing exponentially. What does that mean to agriculture? What does it mean to the equipment? It means that the ability for this machine to go faster than 12 or 14 or 15 miles an hour is going to happen. It's going to happen in a relatively short period of time, because one of the limiting factors for that today is compute. We need to be able to compute all these images as quickly as possible so we know where to spray, and that determines how fast we can drive the sprayer. If you put a higher performance GPU on that product, you now have the ability to drive it faster.
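As a rough illustration of that compute-to-speed link, here is a back-of-the-envelope sketch in Python. The look-ahead distance and inference times are invented for illustration, not John Deere specifications.

```python
# Back-of-the-envelope: how per-frame inference time bounds sprayer ground
# speed. All numbers are illustrative assumptions, not Deere specifications.

FPS_TO_MPH = 3600 / 5280  # convert feet per second to miles per hour

def max_ground_speed_mph(look_ahead_ft: float, inference_s: float) -> float:
    """Speed at which the machine covers the camera's look-ahead
    distance in exactly one inference cycle."""
    return (look_ahead_ft / inference_s) * FPS_TO_MPH

# Assume the camera sees 2 ft of ground ahead of the nozzles and the GPU
# classifies a frame in 100 ms: roughly a 13.6 mph ceiling.
print(max_ground_speed_mph(look_ahead_ft=2.0, inference_s=0.100))
# Halve the latency with a faster GPU and the ceiling doubles to ~27 mph.
print(max_ground_speed_mph(look_ahead_ft=2.0, inference_s=0.050))
```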
We talked about how small a weed See & Spray could see, a quarter inch, remember that? Better GPUs, faster GPUs give you the ability to run higher resolution cameras. Higher resolution cameras give you the ability to see smaller weeds. This is important because this is the brains behind everything that we do in this era of high automation and autonomy. It's getting to the point where the capability of these GPUs-- This is a dangerous analogy, and if you use it outside of this setting, I'll deny I ever said it. We're getting to the point where the processing capability in a GPU is not terribly different from how we think of the processing capability of your human brain. We wouldn't probably have said that 10 years ago. Today, you have the opportunity to at least start to pull that apart a little bit and play with it.
Even though it's not really comparable and not really true, we're getting to the point where the processing capability on these products is becoming so significant that it's really not a limiting factor or won't be a limiting factor for much of what we want to do. And historically, it always has been. These algorithms that we're talking about, the machine learning algorithms that are being used in the autonomous machine and in the See & Spray depend upon data. Data is the fuel for these algorithms. We train these algorithms on immense amounts of information in the form of images. One of the things that is necessary and helpful for us to be able to do that at scale is connectivity on and off-board the machines.
This chart is just to show you the number of John Deere connected machines per day that call in and register and produce useful information within things like John Deere Operations Center. You can see the North American growing season in this data if you've been able to suss it out. Our data tends to spike or peak in spring planting and in fall harvest, and that's the reason the data looks the way it does. The more important point here is there are 600,000 machines that have the ability to start to create data for applications we haven't even thought of.
If we know what those applications are, we can start to harness the power of that connectivity and start to pull data out of the entire fleet to help train these models to produce some value that we haven't even contemplated today. That's a super compelling case, I think, from a John Deere perspective for connectivity on pieces of equipment: the opportunity to start to understand how the equipment is being used, what it's being subjected to, how we can make it better, and how we can produce better solutions as a consequence. This chart is just-- I'll call it megabytes, to not confuse anybody, per connected machine per day. This gives you a sense for how much information is coming off of a piece of our equipment in any given day. This doesn't even include-- today, it doesn't include the image information that would come off of a See & Spray unit or an autonomous tractor.
Somebody asked the question I think earlier in the panel about, what do you do with all the image data, the image information that comes off the sprayer? The answer today is, we don't do anything with it. We don't actually keep it. It's terabytes of data. Once it's processed, it's done. It disappears into the ether and it never comes off the machine. But if you had that high resolution imagery, and you could collect it and you could use it as high-res imagery on the farm, you would probably find value in that at some point. And so this starts to be able to pull open this idea of, "Hey, I have a connected machine, I have the ability to pull information off of it. What other stuff is interesting to me?" That's an interesting space for us to explore as we move forward.
It's not just terrestrial connectivity anymore. If you've noticed, on the top of that tractor, there's a Starlink terminal. One of the fundamental shifts that's happened in the last decade, I would argue, is in the cost of launch. The space industry thinks about the cost associated with its products in space as largely dominated by the cost of launch. It takes a lot of energy to get something from the surface of the planet into orbit. Historically, they measured that cost of launch as thousands of dollars per kilogram of payload. Whatever you want to put into space, whatever the satellite might be, if it weighs a kilogram, historically that might have cost you $5,000, $6,000, $7,000. So, 2.2 pounds would have cost you $5,000, $6,000, $7,000 to put into orbit.
What SpaceX has done in the last decade is nothing short of miraculous. They've taken that cost structure from thousands of dollars for a couple of pounds of payload down to the neighborhood of $500, and they're headed towards $10. If you think about that, it's going to create a proliferation of things in space. That's a problem that humans will have to deal with, but it is going to create an opportunity for us to put more things in orbit that can do other things or maybe better things than they do today. You think about this from an agricultural perspective and things like Earth observation, satellite imagery when you want it, where you want it, as you want it, much deeper than it is today.
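A quick worked version of that launch-cost arithmetic, with an assumed satellite mass chosen purely for illustration:

```python
# Worked version of the launch-cost trend described above. The 260 kg
# satellite mass is an illustrative assumption.
SAT_MASS_KG = 260

for cost_per_kg in (6_000, 500, 10):  # roughly: historical, today, aspirational
    print(f"${cost_per_kg:>5,}/kg -> ${SAT_MASS_KG * cost_per_kg:>9,} per satellite")
# $6,000/kg: $1,560,000; $500/kg: $130,000; $10/kg: $2,600 per satellite.
```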
We think about it in terms of connectivity. We want to be able to connect any machine anywhere, anytime, doesn't matter where they're at on the planet. Satellite connectivity has really opened up the opportunity for us to think about doing that, but it's not the only way that this lower cost of launch is going to transform our industries. This one gets a little bit technical, and I apologize-- I didn't know how to convert it into something that was less technical. This is the AI space. If you're familiar with artificial intelligence, these are the algorithms and the history of algorithms that have changed over the course of really the last 25 years or so. They've gotten increasingly more complex, they've gotten increasingly more capable, they've also gotten increasingly more burdensome from a compute perspective.
Remember that GPU chart that I showed you before? One of the reasons that's important to know about is that the models running in the compute devices are taking the images from these two machines and telling us what's in them, and in the case of the autonomous machine, how far away those things are. That takes a lot of compute to do. We've seen a significant transformation on the model and algorithm side of this as well. Currently, this machine is running a transformer network for object identification in the compute, and this machine is running a convolutional neural network in it. That one used to run a convolutional neural network until about a year and a half ago, and then we switched it because we got a lot better performance out of the more modern architecture in that machine.
This is going to continue to change. The current transformer networks-- if you think about the large language models like ChatGPT, they're built on these transformer networks over on the far-right side of the slide. We don't really know what the natural limit of their capabilities is. The bigger the models get and the more data they're trained on, the more capable they seem to become. We don't understand what the limitations are of them yet, full stop, in terms of their capabilities, so that's an interesting opportunity in front of us. I think it raises the question of, what are the really hard problems in agriculture that we don't have the answers to from a human intellect perspective, that we could somehow start to understand the answers to with tools like this?
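For those who want to see the difference rather than just hear it, here is a toy sketch in Python (PyTorch) of the two building blocks mentioned: a convolution, which looks at a small local window, and self-attention, which lets every image patch look at every other patch in one step. It is a minimal illustration, not the network running on any Deere machine.

```python
# Minimal contrast between the two architectures mentioned above: a
# convolution sees a small local neighborhood, while self-attention gives
# every image patch global context in a single layer. Toy example only.
import torch
import torch.nn as nn

x = torch.randn(1, 64, 32, 32)        # one 32x32 feature map, 64 channels

# CNN building block: each output pixel depends only on a 3x3 neighborhood.
conv = nn.Conv2d(in_channels=64, out_channels=64, kernel_size=3, padding=1)
local = conv(x)                        # shape: (1, 64, 32, 32)

# Transformer building block: flatten the map into 1,024 patch tokens and
# let each token attend to all 1,024 others at once.
tokens = x.flatten(2).transpose(1, 2)  # shape: (1, 1024, 64)
attn = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
global_ctx, _ = attn(tokens, tokens, tokens)  # shape: (1, 1024, 64)
```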
I'll give you an example. Operations Center, many of you use Operations Center today. We use it on our home farm too. If I want to go find out the last 10 years of yield data on a particular field, I can go do that. It might take me 15 minutes in Operations Center to figure it out. If I then want to know, what did I apply for nutrients in that field to produce that outcome, I can go find that out too. It's probably going to take me 15 minutes to figure it out. If I want to know what varieties I grew on that field that produced those outcomes, I can do that too. It's probably going to take me 15 minutes to figure out. As a consequence, I don't really do it, because I just didn't have 45 minutes in my day to go answer that question and start to suss out cause and effect relationships for things, so I probably called an agronomist and had them figure it out for me.
You can think of a network on the right side of the slide that's trained on that Operations Center data for the farm, that gives you the ability to just ask the question of it: what was my yield production for this field over the last 10 years? And it gives you the answer. Think of it like a search engine on steroids. It's not really telling you anything you couldn't go figure out for yourself, but it's going to tell you what you could have figured out for yourself a lot faster. Then maybe the next question is, was there any correlation between genetic varieties on that field and production output, yield per acre? Maybe there is, maybe there isn't. Those are the sorts of things that I think-- we're not there yet, but I think those are the sorts of things that are going to be really interesting for us moving forward, and these new model architectures are going to give us the ability to start thinking about them.
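As a hedged sketch of the same questions answered today with plain exports, here is what that could look like in Python. The file names, column names, and the "North 80" field are invented for illustration; this is not the Operations Center API.

```python
# Sketch: joining per-field, per-year yield and variety records to answer
# the two questions above. All file and column names are hypothetical.
import pandas as pd

yields = pd.read_csv("yield_by_field_year.csv")       # field, year, bu_per_acre
varieties = pd.read_csv("variety_by_field_year.csv")  # field, year, variety

df = yields.merge(varieties, on=["field", "year"])

# "What was my yield on this field over the last 10 years?"
print(df[df["field"] == "North 80"].sort_values("year")[["year", "bu_per_acre"]])

# "Was there any correlation between variety and yield on that field?"
print(df[df["field"] == "North 80"].groupby("variety")["bu_per_acre"].mean())
```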
This machine is equipped with stereo cameras. There are four on each side, for a total of 16 cameras. This is the evolution of camera technology for us, if you think about it, in pixels per dollar: how much resolution do you get in the camera for a dollar that you would spend on that camera? You can see the trend here is encouraging. We're continuing to drive costs down on these sensing devices while performance is improving. Or said differently, we can get the same performance today for less money than we would have had to spend yesterday.
This is going to continue not because of John Deere, not because of agriculture, but because the world is trying to create sensing elements to do things automatically that it has historically done manually. It's not just happening in agriculture. The iPhone is driving a lot of this. The resolution and the miniaturization of components is being driven in a lot of respects by consumer electronics, but that's not the only place where investment is happening and driving the costs of these things down. This is going to continue over time. We're all trying to create sensing elements that replicate the human ability to sense in some respects, and in some cases augment it, and we're trying to do that for less and less money as time goes on. This is going to continue to happen.
When we talked about the resolution of See & Spray and the size of weed we could detect, this is something you ought to keep in mind. The resolution of those cameras, and with it the ability to detect things on the crop, or between and within crops in the case of weeds, is only going to improve. You can think of things like, can I detect whether or not a pest is in that crop? Can I see if that leaf has actually been bitten by an insect or not? Is there a worm on it or not? Those are the sorts of things that you're going to be able to sense in the future with products like this, with higher resolution and better compute from a camera perspective.
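As a rough sketch of how resolution translates into detectable size, here is the arithmetic in Python. The swath width, pixel counts, and the assumption that a detector needs about five pixels across an object are illustrative, not See & Spray specifications.

```python
# Sketch: ground sample distance (GSD) links camera resolution to the
# smallest weed you can reliably detect. Numbers are assumptions.

def gsd_inches(swath_in: float, horizontal_pixels: int) -> float:
    """Inches of ground covered by one pixel across the camera's swath."""
    return swath_in / horizontal_pixels

def min_weed_inches(swath_in: float, horizontal_pixels: int,
                    pixels_needed: int = 5) -> float:
    """Assume a detector needs ~5 pixels across a weed to classify it."""
    return pixels_needed * gsd_inches(swath_in, horizontal_pixels)

# A 40-inch swath seen by a 1,280-pixel-wide sensor: ~0.16 in weeds.
print(min_weed_inches(swath_in=40, horizontal_pixels=1280))
# Double the resolution and the detectable weed size halves: ~0.08 in.
print(min_weed_inches(swath_in=40, horizontal_pixels=2560))
```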
This one's a bit of an interesting chart. This is LiDAR; I'll unpack it briefly. Neither one of these pieces of equipment is equipped with it today, but it is another sensing element that allows us to perceive the environment. We basically shine a light out in a lot of different directions, along a lot of different vectors if you want to think about it that way, and measure the reflection of that light back. From that sensor we get an opportunity to understand distance to objects, and we can create something that's called a point cloud. It's a highly accurate way of assessing distance to things. We're not using this in row crop agriculture yet. We are using this in high-value crops and permanent crops. We're using this in an autonomous project we're working on for nut orchards today because we need that capability. That tractor and that sprayer in that application actually come into contact with the trees all the time. Sensing humans in that noisy environment is a lot more difficult than it is in a corn or soybean stubble field, and so we need the additional resolution that this provides in order to solve for the sensing capability that we need in that application.
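The point-cloud idea can be made concrete in a few lines of Python: each LiDAR return is a measured range plus two beam angles, and converting those to Cartesian coordinates yields the cloud of points. The sample returns below are invented for illustration.

```python
# Sketch: converting spherical LiDAR returns (range + two angles) into
# (x, y, z) points. The stacked rows form the point cloud.
import numpy as np

def returns_to_point_cloud(rng_m, azimuth_rad, elevation_rad):
    """Convert spherical LiDAR returns to Cartesian points (N x 3)."""
    x = rng_m * np.cos(elevation_rad) * np.cos(azimuth_rad)
    y = rng_m * np.cos(elevation_rad) * np.sin(azimuth_rad)
    z = rng_m * np.sin(elevation_rad)
    return np.column_stack((x, y, z))

# Three illustrative returns: a tree trunk at 4 m, canopy at 6 m, ground at 2 m.
cloud = returns_to_point_cloud(
    np.array([4.0, 6.0, 2.0]),
    np.radians([10.0, 15.0, -5.0]),
    np.radians([0.0, 20.0, -15.0]),
)
print(cloud)  # each row is one point's (x, y, z) in meters
```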
A similar trend happens there as happens in the previous slide. Performance is continuing to improve while costs are continuing to come down. You're going to see that again with radar, same sort of trend happens. We don't have radar on this machine yet, but one of the applications we need to solve for with autonomy in agriculture is the ability to see through dust. Fields are dusty, no surprise to anybody, and if cameras can't see through the dust, we need something that can. This gets to be an interesting sensor modality for us to think about, and important as we think about exposing more applications to autonomous operation. How do we think about that with respect to the sensing capabilities, and where are they going with respect to cost and performance?
When we think about where resolution has gone in the last 10 years, along with our ability to process those images, the image on the left is what we could have done at a given, fixed frame rate. That was what we had the capability of doing 10 years ago, both in camera resolution and in the ability to compute it. The stuff on the right is what this machine will do today: significantly higher resolution and significantly more frames per second. Frames per second gives you the ability to respond quicker, act quicker, which is important in things like See & Spray where you want to increase the speed of spraying operations. You've got to be able to process this stuff quicker.
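One way to see why frames per second matters: the distance the machine covers between consecutive frames. The speeds and frame rates below are assumptions for illustration.

```python
# Sketch: distance traveled between consecutive camera frames, which sets
# how finely the machine can react. Speeds and frame rates are assumptions.

def inches_per_frame(speed_mph: float, fps: float) -> float:
    feet_per_s = speed_mph * 5280 / 3600
    return feet_per_s * 12 / fps

# At 15 mph, 5 fps means ~53 inches pass between looks at the ground;
# at 30 fps it drops to ~9 inches, fine-grained enough to act per weed.
print(inches_per_frame(15, 5))   # ~52.8
print(inches_per_frame(15, 30))  # ~8.8
```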
That just tells you-- The reason I love this slide is, in a decade, which-- I'm 49 years old, so a decade doesn't seem that long to me anymore, but once upon a time, it did. In 10 years, which is not that long, we've gone from something that was completely not useful and never going to be possible, to something that's more than possible today. I think that's pretty cool, and I think the change in the next 10 years is not going to be linear. I think you saw the exponential curve on GPU processing capability. I think you ought to think about that exponential curve in the context of all this. I think in 10 years from today, the slide will not be big enough to accommodate all of the capability that we'll have to sense the environment around us and make sense of it from a compute perspective. All right. You all have been great. Thank you so much. I really appreciate it.
Tony Kramer: Please take a moment to subscribe to this podcast if you haven't already. You can subscribe to the show on the many different podcasting apps that we're streaming this out to, such as Apple, Google, and Spotify, as well as many others. While you're out there, drop us a review. We'd love to hear what you think about the show. Lastly, make sure to follow RDO Equipment Company on Facebook, Instagram, and X, and also catch our latest videos on YouTube. You can also follow me on X @RDOTonyK.