
WO2015038751A1 - Method to automatically estimate and classify spatial data for use on real time maps - Google Patents


Info

Publication number
WO2015038751A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
control unit
agricultural machine
sensing feedback
parameter sensing
Prior art date
Application number
PCT/US2014/055160
Other languages
French (fr)
Inventor
Jacob Van Bergeijk
Original Assignee
Agco Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Agco Corporation
Publication of WO2015038751A1

Links

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B: SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B79/00: Methods for working soil
    • A01B79/005: Precision agriculture
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02: Agriculture; Fishing; Forestry; Mining

Definitions

  • The present disclosure is generally related to agricultural production, and more particularly, to systems and methods of processing data captured by agricultural machines during operations.
  • Agricultural machines include sensors and other devices for capturing data during operation of the machines.
  • Combine harvesters may include sensors for sensing data relating to a crop as the crop is harvested, such as moisture content and yield.
  • The data may be stored for later use and/or may be processed in real time or substantially real time and displayed on a display device in the machine during operation. Some users prefer to receive such data during operation to monitor machine performance and other operational characteristics.
  • Machines capture data while travelling through a field at discrete intervals dictated by, for example, sensor sample rates and/or positioning sample rates.
  • The collected data is thus associated with the particular discrete locations in the field where the machine captured it.
  • Variations in machine travel parameters such as speed and working width (the distance between consecutive parallel paths followed by the machine) can result in variations in spatial data density that make the data difficult to interpret if presented, for example, as a map.
  • FIG. 1 is a schematic diagram that illustrates an example environment in which an embodiment of an example parameter sensing feedback system is implemented.
  • FIG. 2 is a screen diagram that illustrates an example user interface from an operator's perspective for an embodiment of an example parameter sensing feedback system that populates the screen with polygons corresponding to a sensed parameter and demarcates each of the polygons with a border.
  • FIG. 3 is a screen diagram that illustrates an example user interface from an operator's perspective for another embodiment of an example parameter sensing feedback system that populates the screen with polygons corresponding to a sensed parameter and demarcates each of the polygons with a border.
  • FIG. 4A is a block diagram that illustrates an embodiment of an example parameter sensing feedback system.
  • FIG. 4B is a block diagram that illustrates an embodiment of an example control unit implemented in an embodiment of the example parameter sensing feedback system of FIG. 4A.
  • FIG. 5 is a flow diagram that illustrates an embodiment of an example parameter sensing feedback method. DESCRIPTION OF EXAMPLE EMBODIMENTS
  • In one embodiment, a parameter sensing feedback method comprises: receiving updated position information while traversing a field; receiving data corresponding to a sensed parameter; processing the data using a geostatistical method; and presenting, concurrently on a map and in real time, a dynamically changing area rendering and an indicator corresponding to respective detected positions in the field, the area rendering associated with the processed data, the indicator overlapping at least a portion of the area rendering and comprising an alphanumeric value corresponding to the processed data.
  • A parameter sensing feedback system and associated method receive and process operational data captured by an agricultural machine (hereinafter, also simply, a machine) and present the data visually in a form that is easily comprehended by an operator.
  • The parameter sensing feedback system "fills in" the area between data samples by estimating intermediate values using one or more geostatistical methods.
  • The estimated values are represented on a screen display as dynamically changing polygons with respective indicators that correspond to the estimated values, the indicators at least partially overlapping the polygons.
  • Each polygon changes shape (e.g., grows in size, or is generally altered in size due to updates based on the processing via the geostatistical methods and based on machine travel and associated sample points) as the machine advances across a field, as detected by position indication equipment on the machine.
  • The indicators may be embodied as a standalone or "boxed" (e.g., framed, such as part of a pop-up icon) alphanumeric value equal to the numerical estimated value, or as an alphanumeric equal to a class or range within which the estimated values fall.
  • The estimated data allows a user to quickly and easily understand operational results across the work area.
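By way of illustration only (and not as part of the disclosed embodiments), the "fill in" step may be sketched with inverse distance weighting, one of the geostatistical methods the disclosure names alongside kriging. The sample coordinates, values, and power parameter below are assumptions chosen for the sketch.

```python
def idw_estimate(samples, x, y, power=2.0):
    """Estimate a parameter value at field point (x, y) by inverse
    distance weighting over (sx, sy, value) sample points."""
    num = den = 0.0
    for sx, sy, value in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:                  # exactly on a sample point
            return value
        w = 1.0 / d2 ** (power / 2)    # weight = 1 / distance**power
        num += w * value
        den += w
    return num / den

# Yield samples (metres east, metres north, bushels/acre) along two
# parallel working paths (illustrative values).
samples = [(0, 0, 100.0), (10, 0, 110.0), (0, 8, 104.0), (10, 8, 118.0)]
print(idw_estimate(samples, 5, 4))  # estimate midway between the paths
```

Each new sample point re-weights nearby estimates, which is why the rendered polygons can change shape as the machine advances.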
  • Parameter sensing feedback systems present a legend-less map based on processing sensed data corresponding to a given parameter (e.g., sensed crop or field conditions, sensed operational data, etc.) using geostatistical methods. For instance, the parameter sensing feedback system generates a contiguous visual representation of the estimated data in real time.
  • The visual representation is presented to the machine operator in real time or substantially real time to enable the operator to review the information during operation (e.g., providing feedback to the operator of the sensed parameter or parameters).
  • The processed or estimated data may be mapped onto an area corresponding to the working area of the machine.
  • Rather than a map of the work area illustrating values only at discrete locations, the parameter sensing feedback system generates a map that is covered or substantially covered with a visual illustration of alphanumeric values (e.g., labels, ranges of estimated values, discrete estimated values) corresponding to each point on the map.
  • A polygon is one example of an area rendering corresponding to the estimated data generated through application of one or more geostatistical methods to sensed data.
  • However, any area rendering of the estimated data onto the map may be used.
  • The area rendering may also comprise arbitrarily and/or irregularly-shaped areas (e.g., without defined boundaries) and/or areas with curved boundaries.
  • Referring to FIG. 1, shown is an example environment in which an embodiment of a parameter sensing feedback system 10 may be used.
  • The parameter sensing feedback system 10 is shown as functionality residing within an agricultural machine 12 (hereinafter, simply referred to as a machine), depicted as a combine harvester for illustration.
  • The parameter sensing feedback system 10 includes one or more control units, a position indication component (e.g., global navigation satellite system, or GNSS, receiver), a sensor system to detect one or more parameters (e.g., operational and/or crop or soil parameters, such as moisture, kernel damage, grain yield, fluid levels, fuel level, chemical production and/or amounts, planting depth, planting population, etc.), and a screen display for rendering dynamically changing polygons representing the data for the respective detected properties at defined sampling points.
  • In some embodiments, the parameter sensing feedback system 10 may include additional components, fewer components, or other components.
  • The position indication component enables a determination (e.g., based on comparison to a locally-stored or remotely accessed map) of the current position of the machine 12.
  • The control unit executes one or more geostatistical methods to process sampled spatial data (e.g., as detected by the sensor system) to be rendered on the screen display in the form of dynamically changing (e.g., changes in shape) polygons.
  • The control unit generates a contiguous real time map for the area covered by (and optionally, to be covered by) the machine 12.
  • Though the machine 12 is primarily described and depicted as a combine harvester, other agricultural machines used for the same or different operations may be used in some embodiments.
  • The parameter sensing feedback system 10 is shown residing within a cab of the machine 12 for illustration, but in some embodiments, one or more functions of the parameter sensing feedback system 10, as explained above and further below, may be distributed throughout the machine 12, distributed among plural machines, and/or located remotely, such as in one or more computing systems, such as computing system 14.
  • The computing system 14 may be embodied as a server or other computing device that is located remotely from the machine 12 and is communicatively coupled to the machine 12 over a network 16.
  • The computing system 14 may include other equipment (e.g., gateways, routers, switches, etc.), with functionality distributed among one or more facilities, such as an Internet Service Provider (ISP) facility, a regional or local machine manufacturer's representative facility, a manufacturer's facility, or a residence, among other facilities.
  • The computing system 14 may store and update one or more data structures (e.g., databases) of geographical information (e.g., maps, including field boundary coordinates, topographic information, etc.) for fields farmed using the machine 12 or other machines.
  • In some embodiments, one or more functions of the control unit may be performed by the computing system 14.
  • The data for the sensed parameter may be communicated to the computing system 14, for example, which may process the sensed data.
  • Processing of the data includes applying geostatistical methods to the received (sensed) data and generating the estimated data.
  • When the computing system 14 has performed the processing, it communicates the processed (e.g., estimated) data back to the machine 12.
  • The network 16 may include one or more networks based on one or a plurality of communication protocols.
  • The network 16 may comprise a wide area network, such as the Internet, one or more local area networks, such as a radio frequency (RF) network, a cellular network, POTS, WiFi, WiMax, and/or other networks, such as a satellite network.
  • The computing system 14 may host a web-service, or serve as a gateway to one or more other servers in the Internet (e.g., as a gateway to a cloud service), and be coupled to the machine 12 (e.g., via a communications interface) over a wireless, cellular connection.
  • In one embodiment, the machine 12 comprises a position indication component, as explained below, that is coupled to a satellite network.
  • FIGS. 2-3 depict example screen displays 18 of an embodiment of a parameter sensing feedback system 10.
  • The screen displays 18 are coupled to the aforementioned control unit, and provide the operator with feedback of one or more sensed parameters as the machine 12 traverses a field. It should be appreciated by one having ordinary skill in the art that the screen displays 18 are for illustration, and that in some embodiments, additional, fewer, or different data may be presented on the screen display 18.
  • The parameter sensing feedback system 10 may adjust or assign estimated spatial values to a plurality of classes (e.g., ranges of estimated values for a given sensed parameter).
  • The classes may be automatically generated or, in some embodiments, may be provided by an operator.
  • If the sensed and estimated data relates to yield, each class may correspond to a range of estimated yield values, such as bushels per acre. If the sensed and estimated data relates to moisture content, each class may correspond to a range of estimated moisture content percentages. Other parameters may be used, as explained above.
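Purely as a sketch of the class assignment described above (the boundary values and labels below are illustrative assumptions, not values from the disclosure), an estimated value can be binned into a class as follows:

```python
def assign_class(value, boundaries, labels):
    """Map an estimated value to a class label given ascending upper
    boundaries (e.g., yield in bushels per acre)."""
    for upper, label in zip(boundaries, labels):
        if value <= upper:
            return label
    return labels[-1]  # anything above the last boundary

# Illustrative yield classes: A = up to 110, B = 111-120, C = 121-130.
boundaries = [110, 120, 130]
labels = ["A", "B", "C"]
print(assign_class(108.4, boundaries, labels))  # -> A
print(assign_class(125.0, boundaries, labels))  # -> C
```

Each estimated point then inherits a class label, and contiguous points of the same class form one polygon on the map.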
  • A map 20 is presented on the screen display 18, such as during farming operations implemented by the machine 12. The map 20 illustrates, in this example, two distinct regions: one being a field 22 on which the machine 12 works, the other being a region comprising a plurality of polygons 24 (e.g., 24A, 24B, 24C, 24D, 24E, etc.).
  • The polygons 24 represent processed data (e.g., estimated data generated based on geostatistical methods) corresponding to one or more parameters sensed by a sensor system of, or associated with, the parameter sensing feedback system 10.
  • The polygons 24 also correspond to the areas of the field 22 worked by the machine 12 and for which work has been completed.
  • Each polygon 24 is assigned a given class (e.g., automatically or operator-configured). Differences in class may be represented on the map 20 by any one or more of a plurality of methods. For instance, a polygon 24 corresponding to one class may have a different color, pattern, and/or alphanumeric located within the polygon, when compared to a polygon 24 assigned to a different class.
  • A leading edge icon 26 has an arrow indicating the direction of movement of the machine 12, as well as illustrating the change in size of a polygon 24 (or the commencement of a new polygon 24) as the machine 12 advances along the field 22 within an area of the same class or into a different class, respectively.
  • In some embodiments, additional and/or other icons may be used to convey the advancing border of the polygon, such as a straight edge without an arrow, or an icon representing a likeness of the machine 12, among other icons.
  • Note that the leading edge icon 26 merely illustrates the advancement of the machine 12 within the same class or into a different class, and that the size of each polygon 24 may change along other sides as the map 20 is updated through the continued processing of data.
  • The actual value of A may be presented in an area overlapping at least a portion of the polygon 24A responsive to operator input. For instance, an operator may physically touch the screen display 18 at the location of the polygon 24A, which, with touch-screen type technology, prompts a signal representing the coordinate value on the screen, prompting the control unit to present the estimated value or range of values of class A in an area within, or at least partially overlapping, the polygon 24A. In some embodiments, the operator may maneuver a cursor over the polygon 24A, or verbally request a displayed value for the corresponding label A in polygon 24A, to achieve the same or similar effect.
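The touch-to-reveal behavior described above amounts to hit-testing the touch coordinate against the rendered polygons. The following sketch uses a standard ray-casting point-in-polygon test; the screen-space polygons and their coordinates are hypothetical, invented for illustration.

```python
def point_in_polygon(px, py, vertices):
    """Ray-casting test: does screen point (px, py) fall inside the
    polygon given by its ordered (x, y) vertices?"""
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        if (y1 > py) != (y2 > py):  # edge straddles the horizontal ray
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

def polygon_at_touch(px, py, polygons):
    """Return the class label of the polygon under the touch, if any."""
    for label, vertices in polygons:
        if point_in_polygon(px, py, vertices):
            return label
    return None

# Two illustrative screen-space polygons for classes A and B.
polygons = [("A", [(0, 0), (100, 0), (100, 50), (0, 50)]),
            ("B", [(0, 50), (100, 50), (100, 90), (0, 90)])]
print(polygon_at_touch(40, 30, polygons))  # touch lands in class A's polygon
```

Once the polygon under the touch is identified, the control unit can look up that class's estimated value or range and render it as a pop-up indicator over the polygon.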
  • The indicator may comprise a numeric value (e.g., an estimated value or estimated average value) or an alphanumeric comprising a range of values (e.g., 100-110 bushels per acre).
  • In some embodiments, no persistently displayed alphanumeric is presented; distinctions in class are merely shown with differences in color and/or pattern, and the operator may invoke a transitory presentation of the range or estimated value in a manner as described above.
  • In some embodiments, an icon may be presented in each polygon 24 to visually illustrate the value of the parameter corresponding to the polygon 24.
  • For instance, a graphic of a bushel may be presented with a volume of material shown in a manner commensurate with the estimated value relative to a total span of ranges (e.g., a mid-range value may show a bushel half-full).
  • The alphanumerics (e.g., A, B, C, etc.), numeric values (e.g., average estimated value, median estimated value, etc.), ranges of estimated values, and pop-up icons or other icons presented in the polygon 24, or at least overlapping a portion of the associated polygon 24, are collectively referred to herein as indicators.
  • The alphanumeric may be presented as a standalone value, or framed according to any of a plurality of graphic formats (e.g., highlighted, underlined, etc.).
  • The polygon 24A, like other polygons 24, is optionally framed along its periphery with a border 28, which further demarcates the shape of the polygon 24A and facilitates visual distinctiveness from adjacent polygons 24 and/or the un-worked field 22.
  • As with the polygon 24A and its associated class "A," another type of alphanumeric and/or icon may be used in lieu of (or supplemental to) "B."
  • The estimated value or range of estimated values corresponding to class B may be prompted for display within, or otherwise overlapping at least in part, the polygon 24B through one or more actions by the operator.
  • The polygon 24B is likewise presented with a border 30 along the periphery of the polygon 24B, which facilitates visual distinctiveness among adjacent polygons 24 and/or the un-worked field 22, as well as any change in shape. Note that the various mechanisms described above to distinguish classes may be used alone or in different combinations.
  • Each polygon 24 with an associated different class may be distinguished using one or any combination of different colors, shadings, fill patterns, icons, and/or alphanumerics (the latter invoked by the operator or persistently displayed).
  • As the data collected by the machine 12 changes over the course of the working path, the data is assigned to different classes.
  • In this example, the data is assigned to classes A, then B, then C (polygon 24C), then B (polygon 24D), and then A (polygon 24E).
  • Classes B and C represent different estimated values, estimated average values, or different ranges of estimated values, such as 111-120 and 121-130, respectively. Note that the values provided are merely examples, and other values and/or spans for the individual ranges may be used.
  • As the machine traverses the second working path 32 (e.g., in the direction shown by the arrow of the icon 26), additional data is processed to generate intermediate estimated values.
  • The new estimated values are also placed into classes, and the visual areas (e.g., polygons 24) associated with each class change shape (e.g., expand), such as the large A area near the top corresponding to polygon 24E and the polygon 24D corresponding to class B adjacent to the polygon 24E.
  • In other words, the polygons 24 corresponding to the different classes may increase in size or, generally, change shape.
  • The data classification information, such as an estimated value or range of estimated values corresponding to each class, may be configured by the operator or may be automatically generated based on the estimated data.
  • For instance, the parameter sensing feedback system 10 may generate multiple classes corresponding, for example, to uniform divisions between the lowest and highest estimated values.
  • The class values may be visible on the map 20 itself (e.g., persistently displayed) or may be presented temporarily via a pop-up window or user interface tab when selected by the operator.
  • The operator may adjust the granularity of the classes and areas at any time via a user interface element such as a button or knob.
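Automatic class generation as uniform divisions between the lowest and highest estimated values, with operator-adjustable granularity, might be sketched as follows; the number of classes and the sample estimates are assumptions for illustration.

```python
def uniform_classes(values, n_classes):
    """Split [min(values), max(values)] into n_classes equal ranges and
    return the upper boundary of each class."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_classes
    return [lo + width * (i + 1) for i in range(n_classes)]

# Estimated yields from the geostatistical step (illustrative values).
estimates = [100.0, 108.0, 115.0, 124.0, 130.0]

# Operator turns a granularity knob: coarse (3 classes) vs. fine (6).
print(uniform_classes(estimates, 3))  # -> [110.0, 120.0, 130.0]
print(uniform_classes(estimates, 6))
```

Raising the class count splits the map into more, smaller polygons; lowering it merges adjacent polygons into broader classes.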
  • In some embodiments, the parameter sensing feedback system 10 may present each estimated data point on the screen display 18.
  • The estimated data may be mapped to the area worked by the machine 12 and/or may be mapped to areas beyond the area worked by the machine 12.
  • For example, the parameter sensing feedback system 10 may generate estimated data corresponding to a band of a designated or predetermined width around an outer edge of the worked area. Furthermore, the parameter sensing feedback system 10 may generate estimated data corresponding to a proximate row or rows.
  • In FIG. 3, the map 20 is presented on the screen display 18 with the persistently labeled classes (A, B, C) for each of the polygons 24, where an indicator such as a pop-up icon 34 is invoked by the operator as shown for polygon 24C.
  • The operator in this example seeks to learn the estimated value or range of estimated values for class C and, in one embodiment, touches the screen display 18 in a location over the polygon 24C to invoke the pop-up icon 34 (or invokes the pop-up icon 34 using other user interface methods, as previously described).
  • The pop-up icon 34 replaces the label C (though in some embodiments, the label C may remain) and includes a range of estimated yield values for class C associated with polygon 24C, wherein the pop-up icon 34 is presented entirely within the polygon 24C. In some embodiments, only a portion of the pop-up icon 34 overlaps the polygon 24C. Note that, as described above, other mechanisms for visually distinguishing and/or representing the associated class for each polygon 24 may be used in conjunction with the pop-up icon 34 in some embodiments, and in some embodiments, the pop-up icon 34 may be persistently displayed in lieu of any alphanumeric and/or other icons.
  • FIG. 4A illustrates an embodiment of a parameter sensing feedback system 10.
  • In some embodiments, the parameter sensing feedback system 10 may be distributed among plural machines. For instance, functionality of the parameter sensing feedback system 10 may be distributed among a towing machine and a towed machine, such as when sensing the chemical output (e.g., sensed parameter) of a towed implement. As another example, multiple machines may operate in the same field at the same time.
  • Parameters may be sensed among the multiple machines, and shared via an ad hoc network among the multiple machines, or via communication over the network 16 to and from the computing system 14 (e.g., serving as a cloud system or Internet server for each machine).
  • Parameter sensing feedback software in each machine may apply the geostatistical methods to the data received from one or more of the other machines, in addition to the host machine's own data, to generate estimated data, and render the dynamically changing polygons based on the estimated data onto the map of the respective screen display 18.
  • In some embodiments, the estimated data (in lieu of the sensed data) may be communicated among the machines for rendering of polygons on each respective screen display 18.
  • The parameter sensing feedback system 10 comprises one or more control units, such as the control unit 36.
  • The control unit 36 is coupled via one or more networks, such as network 38 (e.g., a CAN network or other network, such as a network in conformance to the ISO 11783 standard, also referred to as "Isobus"), to a position indication component 40 (e.g., which may include one or more receivers that include the ability to access one or more constellations jointly or separately via a global navigation satellite system (GNSS), such as the global positioning system (GPS), GLONASS, or Galileo, among other constellations), a user interface 42 (which in one embodiment includes the screen display 18), a network interface 44, and one or more sensors of a sensor system 46.
  • In one embodiment, the position indication component 40 comprises a GNSS receiver that continually updates the control unit 36 with real time position information indicating a current geographical position of the machine 12, which the control unit 36 compares to a map (e.g., geographical coordinates within a defined region from where the machine 12 is operating) to enable the presentation of machine operations onto the map.
  • The user interface 42 may include one or more of a keyboard, mouse, microphone, touch-type display device, such as the screen display 18, joystick, steering wheel, or other devices (e.g., switches, immersive head set, etc.) that enable input and/or output by an operator (e.g., to prompt indicators onto a map).
  • In other words, the screen display 18 may be a component of the user interface 42.
  • The network interface 44 comprises hardware and/or software that enable wireless connection to the network 16 (FIG. 1).
  • The network interface 44 may cooperate with browser software or other software of the control unit 36 to communicate with the computing system 14 (FIG. 1), such as via cellular links, among other telephony and radio frequency communication mechanisms.
  • In some embodiments, the computing system 14 may host a cloud service, whereby all or a portion of the functionality of the control unit 36 resides on the computing system 14 and is accessed by the control unit 36 via the network interface 44.
  • For instance, the computing system 14 may receive position information from the control unit 36 (via the network interface 44) and, based on geographic information stored at, or in association with, the computing system 14, estimate data based on one or more sensed properties for communication back to the control unit 36 for display on the screen display 18.
  • The network interface 44 may comprise MAC and PHY components (e.g., radio circuitry, including transceivers, antennas, etc.), as should be appreciated by one having ordinary skill in the art.
  • The network interface 44 may also enable wireless communication (e.g., sharing of data) among multiple machines operating in the same field or generally within wireless range, such as in an ad hoc network. Multiple machines may also communicate with each other via the network interface 44 and the cloud hosted by the computing system 14.
  • The sensor system 46 may comprise one or more sensors of the machine 12 to sense machine and/or field parameters, such as grain yield, crop damage, moisture, fluid level, fuel level, or chemical production, among other parameters for which an operator of the machine may wish to have visual feedback.
  • The sensors of the sensor system 46 may be embodied as contact sensors (e.g., electromechanical sensors, such as position sensors, safety switches, etc.) and non-contact type sensors (e.g., photo-electric, inductive, capacitive, ultrasonic, etc.), all of which comprise known technology.
  • The control unit 36 is configured to receive and process information from the network interface 44, the position indication component 40, the sensor system 46, and/or the user interface 42.
  • For instance, the control unit 36 may receive input from the user interface 42 (e.g., screen display 18), such as to enable the operator to prompt (and/or acknowledge) a pop-up icon or other indicator presented in overlapping manner on polygons of a map.
  • The control unit 36 receives data from the sensor system 46 and processes the data using one or more geostatistical methods (e.g., kriging, inverse distance weighting, etc.) to generate (e.g., extrapolate or, generally, fill in) estimated data based on the data received from the sensor system 46.
  • The control unit 36 presents the map and a plurality of polygons corresponding to the processed data on the map, framing the periphery of each polygon with a border in some embodiments (though some embodiments may omit the border for all or a portion of the polygons) and presenting on each polygon, or at least partially overlapping each polygon, an indicator of the estimated value(s) in the form of an alphanumeric (e.g., estimated value, label, etc.) and/or icon (e.g., pop-up icon, symbolic icon, etc.).
  • As described above, the indicator may be transitory in nature (e.g., invoked by an operator) or persistently displayed.
  • The control unit 36 is also configured to cause the transmission of information (and/or enable the reception of information) via the network interface 44 for communication with the computing system 14, as set forth above.
  • FIG. 4B further illustrates an example embodiment of the control unit 36.
  • The example control unit 36 is merely illustrative; some embodiments of control units may comprise fewer or additional components, and/or some of the functionality associated with the various components depicted in FIG. 4B may be combined or further distributed among additional modules. It should be appreciated that, though described in the context of residing in the machine 12, the control unit 36, or all or a portion of its corresponding functionality, may be implemented in a computing device or system (e.g., computing system 14) located external to the machine 12, or distributed among plural machines in some embodiments.
  • The control unit 36 is depicted in this example as a computer system, but may be embodied as a programmable logic controller (PLC), field programmable gate array (FPGA), or application specific integrated circuit (ASIC), among other devices. It should be appreciated that certain well-known components of computer systems are omitted here to avoid obfuscating relevant features of the control unit 36.
  • In one embodiment, the control unit 36 comprises one or more processors (also referred to herein as processor units or processing units), such as processor 48, input/output (I/O) interface(s) 50, and memory 52, all coupled to one or more data busses, such as data bus 54.
  • The memory 52 may include any one or a combination of volatile memory elements (e.g., random-access memory (RAM), such as DRAM and SRAM) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.).
  • The memory 52 may store a native operating system, one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc.
  • The memory 52 may store geographical information, such as one or more field maps (e.g., geographical coordinates of the entire field).
  • The geographical information may include topographic features of the fields in some embodiments.
  • The field maps may be in the form of aerial imagery or recorded geographical coordinates of one or more fields, including recorded entry points, identified boundaries of the one or more fields, paths or waylines previously determined, customizations, and other data pertinent to farming.
  • In some embodiments, the geographical information may be stored remotely (e.g., at the computing system 14) or stored in a distributed manner (e.g., in memory 52 and remotely).
  • the memory 52 comprises an operating system 56, and parameter sensing feedback (PSF) software 58.
  • A separate storage device may be coupled to the data bus 54, such as a persistent memory (e.g., optical, magnetic, and/or semiconductor memory and associated drives).
  • The parameter sensing feedback software 58, as executed by the processor 48, is described below as performing certain functions, though it should be appreciated by one having ordinary skill in the art in the context of the present disclosure that some functions may be offloaded to other software and provided as input to the parameter sensing feedback software 58 in some embodiments.
  • The parameter sensing feedback software 58 receives position information (e.g., from the position indication component 40), compares the position information with geographic information stored locally or remotely, and determines the position of the machine 12 in the field.
  • The parameter sensing feedback software 58 also receives data corresponding to a sensed parameter from the sensor system 46. For instance, the data is sampled at defined times and/or positions of the machine 12 and provided to the parameter sensing feedback software 58.
  • The parameter sensing feedback software 58 processes these samples (data) to produce a real time map with a set of contiguous classes for the sensed parameters.
  • The parameter sensing feedback system 10 processes the data by applying geostatistical methods to the sampled spatial data, which is then rendered as a contiguous real time map for the area covered by the machine 12 and/or a coupled implement (e.g., header, towed implement, etc.).
  • The geostatistical method or methods used may be selected and configured by an operator of the machine 12 or, in some embodiments, automatically selected. Additional processing includes filtering the data and/or other signal conditioning.
  • New spatial data gathered by the sensor system 46 on, or associated with, the machine 12 is input to a geostatistical algorithm (or algorithms) of the parameter sensing feedback software 58 to update the real time map, using a user interface (e.g., graphical user interface) component of the parameter sensing feedback software 58 or a separate user interface software component.
  • Updating includes adjusting the classification of the data over either an operator-configurable set of classes or an automatic classification.
  • The area onto which the processed data is mapped (e.g., in the form of polygons with an optional border and associated indicator) can be synchronized with the worked area or configured to predict to a certain band around the worked area.
  • The data classification values can be configured by the operator to be visible on the map itself (e.g., persistent) or as a pop-up icon (e.g., temporary or transitory) presenting a detailed data information box when the operator touches an area of the map on the screen display 18 or otherwise invokes it using other user interface mechanisms.
  • Execution of the parameter sensing feedback software 58 may be implemented by the processor 48 under the management and/or control of the operating system 56.
  • In some embodiments, the operating system 56 may be omitted and a more rudimentary manner of control implemented.
  • The processor 48 may be embodied as a custom-made or commercially available processor, a central processing unit (CPU) or an auxiliary processor among several processors, a semiconductor-based microprocessor (in the form of a microchip), a macroprocessor, one or more application specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and/or other well-known electrical configurations comprising discrete elements, both individually and in various combinations, to coordinate the overall operation of the control unit 36.
  • The I/O interfaces 50 provide one or more interfaces to the network 38 and other networks.
  • The I/O interfaces 50 may comprise any number of interfaces for the input and output of signals (e.g., analog or digital data) for conveyance of information (e.g., data) over the network 38.
  • The input may comprise input by an operator (local or remote) through the user interface 42 and input from signals carrying information from one or more of the components of the parameter sensing feedback system 10, such as the position indication component 40, the sensor system 46, and/or the network interface 44, among other devices.
  • When certain embodiments of the control unit 36 are implemented at least in part with software (including firmware), as depicted in FIG. 4B, it should be noted that the software can be stored on a variety of non-transitory computer-readable media for use by, or in connection with, a variety of computer-related systems or methods.
  • A computer-readable medium may comprise an electronic, magnetic, optical, or other physical device or apparatus that may contain or store a computer program (e.g., executable code or instructions) for use by or in connection with a computer-related system or method.
  • The software may be embedded in a variety of computer-readable media for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
  • When certain embodiments of the control unit 36 are implemented at least in part with hardware, such functionality may be implemented with any or a combination of the following technologies, which are all well known in the art: discrete logic circuits having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
  • References to “one embodiment”, “an embodiment”, or “embodiments” mean that the feature or features being referred to are included in at least one embodiment of the technology.
  • References to “one embodiment”, “an embodiment”, or “embodiments” in this description do not necessarily refer to the same embodiment and are also not mutually exclusive unless so stated and/or except as will be readily apparent to those skilled in the art from the description.
  • A feature, structure, act, etc. described in one embodiment may also be included in other embodiments, but is not necessarily included.
  • The present technology can include a variety of combinations and/or integrations of the embodiments described herein.
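The sampling behavior noted in the bullets above — data sampled at defined times and/or positions of the machine 12 — can be illustrated with a small distance-based trigger. This is a hedged sketch only: the function name `should_sample` and the 5-metre default interval are assumptions for illustration, not details from the disclosure.

```python
import math

def should_sample(last_pos, cur_pos, interval_m=5.0):
    """Return True once the machine has travelled at least interval_m
    metres since the previous sample (fixed-distance sampling, which
    reduces spatial data density variations caused by speed changes)."""
    dx = cur_pos[0] - last_pos[0]
    dy = cur_pos[1] - last_pos[1]
    return math.hypot(dx, dy) >= interval_m
```

Triggering on distance travelled rather than on time keeps the sample spacing uniform regardless of travel speed, one of the mitigations discussed in the background of the disclosure.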

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Soil Sciences (AREA)
  • Environmental Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

In one embodiment, a parameter sensing feedback method, comprising: receiving updated position information while traversing a field; receiving data corresponding to a sensed parameter; processing the data using a geostatistical method; presenting concurrently on a map and in real time a dynamically changing area rendering and an indicator corresponding to respective detected positions in the field, the area rendering associated with the processed data, the indicator overlapping at least a portion of the area rendering and comprising an alphanumeric value corresponding to the processed data.

Description

METHOD TO AUTOMATICALLY ESTIMATE AND CLASSIFY SPATIAL DATA FOR USE ON REAL TIME MAPS
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Application Nos. 61/917,962, filed December 19, 2013, and 61/843,620, filed September 13, 2013, both of which are hereby incorporated by reference in their entirety.
TECHNICAL FIELD
[0002] The present disclosure is generally related to agricultural production, and more particularly, to systems and methods of processing data captured by agricultural machines during operations.
BACKGROUND
[0003] Agricultural machines include sensors and other devices for capturing data during operation of the machines. Combine harvesters, for example, may include sensors for sensing data relating to a crop as the crop is harvested, such as moisture content and yield. The data may be stored for later use and/or may be processed in real time or substantially real time and displayed on a display device in the machine during operation. Some users prefer to receive such data during operation to monitor machine performance and other operational characteristics.
[0004] Presenting captured machine data in real time presents certain challenges. Machines capture data while travelling through a field at discrete intervals dictated by, for example, sensor sample rates and/or positioning sample rates. The collected data is thus associated with particular discrete locations in the field corresponding to the locations of the machine in the field where the data was captured. Variations in machine travel parameters, such as speed and working width (the distance between consecutive parallel paths followed by the machine) can result in variations in spatial data density that make the data difficult to interpret if presented, for example, as a map.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] Many aspects of the disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
[0006] FIG. 1 is a schematic diagram that illustrates an example environment in which an embodiment of an example parameter sensing feedback system is implemented.
[0007] FIG. 2 is a screen diagram that illustrates an example user interface from an operator's perspective for an embodiment of an example parameter sensing feedback system that populates the screen with polygons corresponding to a sensed parameter and demarcates each of the polygons with a border.
[0008] FIG. 3 is a screen diagram that illustrates an example user interface from an operator's perspective for another embodiment of an example parameter sensing feedback system that populates the screen with polygons corresponding to a sensed parameter and demarcates each of the polygons with a border.
[0009] FIG. 4A is a block diagram that illustrates an embodiment of an example parameter sensing feedback system.
[0010] FIG. 4B is a block diagram that illustrates an embodiment of an example control unit implemented in an embodiment of the example parameter sensing feedback system of FIG. 4A.
[0011] FIG. 5 is a flow diagram that illustrates an embodiment of an example parameter sensing feedback method.
DESCRIPTION OF EXAMPLE EMBODIMENTS
Overview
[0012] In one embodiment, a parameter sensing feedback method, comprising: receiving updated position information while traversing a field; receiving data corresponding to a sensed parameter; processing the data using a geostatistical method; and presenting concurrently on a map and in real time a dynamically changing area rendering and an indicator corresponding to respective detected positions in the field, the area rendering associated with the processed data, the indicator overlapping at least a portion of the area rendering and comprising an alphanumeric value corresponding to the processed data.
Detailed Description
[0013] Certain embodiments of a parameter sensing feedback system and associated method are disclosed that receive and process operational data and present the data visually in a form that is easily comprehended by an operator. For instance, an agricultural machine (hereinafter, also simply, machine) captures data at spatial intervals and uses one or more geostatistical methods to process the sampled data, resulting in the generation of a continuous spatial estimation of the data based on the captured data. In other words, the parameter sensing feedback system "fills in" the area between data samples by estimating intermediate values using one or more geostatistical methods. The estimated values are represented on a screen display as dynamically changing polygons with respective indicators that correspond to the estimated values, the indicators at least partially overlapping the polygons. For instance, each polygon changes shape (e.g., grows in size, or is generally altered in size due to updates based on the processing via the geostatistical methods and based on machine travel and associated sample points) as the machine advances across a field, as detected by position indication equipment on the machine. The indicators may be embodied as a standalone or "boxed" (e.g., framed, such as part of a pop-up icon) alphanumeric value equal to the numerical estimated value, or as an alphanumeric value identifying a class or range in which the estimated values fall. When presented in a visual form, the estimated data allows a user to quickly and easily understand operational results across the work area.
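The "fill in" step described in the preceding paragraph can be illustrated with inverse distance weighting (IDW), one simple geostatistical estimator. The disclosure leaves the actual method configurable (kriging would be another common choice), so IDW here is an assumed example and `idw_estimate` a hypothetical name.

```python
def idw_estimate(samples, x, y, power=2.0):
    """Estimate the parameter value at (x, y) from discrete samples
    [(sx, sy, value), ...] by weighting each sample with the inverse
    of its distance to the query point raised to `power` (IDW)."""
    num = den = 0.0
    for sx, sy, value in samples:
        d2 = (sx - x) ** 2 + (sy - y) ** 2
        if d2 == 0.0:
            return value  # query point coincides with a sample
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den

# Midway between two yield samples, the estimate falls between them:
yields = [(0.0, 0.0, 100.0), (10.0, 0.0, 120.0)]
midpoint = idw_estimate(yields, 5.0, 0.0)  # 110.0
```

Because the weights decay with distance, estimates near a sample track that sample closely, while estimates between samples blend their values — producing the continuous surface that the polygons visualize.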
[0014] Digressing briefly, current systems on agricultural machines that process data to be displayed on a real time map display the data at discrete intervals detected by a position indication component and at the sensor sample rates. With this real time mapping method, variations in travel speed and working width cause spatial data density variations that may make the map difficult to interpret by an operator. One method to mitigate this shortcoming is to sample the data at fixed distance intervals, which reduces variations due to speed differences but does not eliminate issues related to working width adjustments. Some systems that have used distance traveled since a prior data sample point and working width to render coverage polygons also face certain challenges. For instance, when the working width is inaccurate or the data source produces data samples that contain relatively high noise from sample to sample, the interpretation of spatial data on the real time map becomes difficult. That is, noise present in the sampled data may negatively affect readability of the real time map. Further, the use of legends to interpret the map adds a further visual complexity to the system while consuming screen space separate from the map that otherwise may be better utilized for the map. In contrast to the conventional systems described above, certain embodiments of parameter sensing feedback systems present a legend-less map based on processing sensed data corresponding to a given parameter (e.g., sensed crop or field conditions, sensed operational data, etc.) using geostatistical methods. For instance, the parameter sensing feedback system generates a contiguous visual representation of the estimated data in real time. The visual representation is presented to the machine operator in real time or substantially real time to enable the operator to review the information during operation (e.g., providing feedback to the operator of the sensed parameter or parameters). 
The processed or estimated data may be mapped onto an area corresponding to the working area of the machine. Thus, rather than a map of the work area illustrating values only at discrete locations on the map, the parameter sensing feedback system generates a map that is covered or substantially covered with a visual illustration of alphanumeric values (e.g., labels, range of estimated values, discrete estimated values) corresponding to each point on the map.
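One way to realize a map that is covered with values, rather than dotted with discrete sample points, is to assign an estimated value to every cell of a grid spanning the worked area. The nearest-neighbor estimator and grid-of-cells representation below are illustrative assumptions, not the specific rendering method of the disclosure.

```python
def nearest_value(samples, x, y):
    """Value of the sample closest to (x, y); samples are (sx, sy, value)."""
    return min(samples, key=lambda s: (s[0] - x) ** 2 + (s[1] - y) ** 2)[2]

def cover_grid(samples, width, height, cell=1.0):
    """Assign every grid cell an estimated value so the rendered map is
    contiguous; neighboring cells that fall in the same class can then
    merge visually into the polygon-like areas described in the text."""
    grid = {}
    for i in range(int(width / cell)):
        for j in range(int(height / cell)):
            cx, cy = (i + 0.5) * cell, (j + 0.5) * cell
            grid[(i, j)] = nearest_value(samples, cx, cy)
    return grid
```

A finer `cell` size yields smoother area boundaries at the cost of more estimation work per map update.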
[0015] Having summarized certain features of parameter sensing feedback systems of the present disclosure, reference will now be made in detail to the description of the disclosure as illustrated in the drawings. While the disclosure will be described in connection with these drawings, there is no intent to limit it to the embodiment or embodiments disclosed herein. For instance, in the description that follows, one focus is on an agricultural machine depicted in the figures as a self-propelled combine harvester, though it should be appreciated that some embodiments may use other agricultural machines (e.g., for tilling, planting, mowing, water or chemical disbursement, towing an implement, etc.), and hence are contemplated to be within the scope of the disclosure. Further, although the description identifies or describes specifics of one or more embodiments, such specifics are not necessarily part of every embodiment, nor are all various stated advantages necessarily associated with a single embodiment or all embodiments. On the contrary, the intent is to cover all alternatives, modifications and equivalents included within the spirit and scope of the disclosure as defined by the appended claims. Further, it should be appreciated in the context of the present disclosure that the claims are not necessarily limited to the particular embodiments set out in the description.
[0016] Note that references hereinafter made to certain directions, such as, for example, "front", "rear", "left" and "right", are made as viewed from the rear of the machine looking forwardly. Further, though the description below uses the term "polygon" as an example area rendering corresponding to the estimated data generated through application of one or more geostatistical methods to sensed data, it should be appreciated that any area rendering of the estimated data onto the map may be used. For instance, aside from the traditional connotations of a polygon, the area rendering may also comprise arbitrarily and/or irregularly-shaped areas (e.g., without defined boundaries) and/or areas with curved boundaries.
[0017] Referring now to FIG. 1, shown is an example environment in which an embodiment of a parameter sensing feedback system 10 may be used. In particular, the parameter sensing feedback system 10 is shown as functionality residing within an agricultural machine 12 (hereinafter, simply referred to as a machine) depicted as a combine harvester for illustration. In one embodiment, the parameter sensing feedback system 10 includes one or more control units, a position indication component (e.g., global navigation satellite system, or GNSS, receiver), a sensor system to detect one or more parameters (e.g., operational and/or crop or soil parameters, such as moisture, kernel damage, grain yield, fluid levels, fuel level, chemical production and/or amounts, planting depth, planting population, etc.), and a screen display for rendering dynamically changing polygons representing the data for the respective detected properties at defined sampling points. In some embodiments, the parameter sensing feedback system 10 may include additional components, fewer components, or other components. The position indication component enables a determination (e.g., based on comparison to a locally-stored or remotely accessed map) of the current position of the machine 12. The control unit executes one or more geostatistical methods to process sampled spatial data (e.g., as detected by the sensor system) to be rendered on the screen display in the form of dynamically changing (e.g., changing in shape) polygons. In other words, the control unit generates a contiguous real time map for the area covered by (and optionally, to be covered by) the machine 12. It should be appreciated within the context of the present disclosure that, though the machine 12 is primarily described, and always depicted, as a combine harvester, other agricultural machines used for the same or different operations may be used in some embodiments. Further, it is noted that the machine 12 is shown in FIG. 1 without the attached implement (e.g., header of the combine) for purposes of brevity, with the understanding that one of a plurality of different types of headers (or other implements, depending on the machine) may be used. The parameter sensing feedback system 10 is shown residing within a cab of the machine 12 for illustration, but in some embodiments, some or all of the functionality of the parameter sensing feedback system 10, as explained above and further below, may be distributed throughout the machine 12, distributed among plural machines, and/or located remotely, such as in one or more computing systems, such as computing system 14.
[0018] The computing system 14 may be embodied as a server, or other computing device, that is located remotely from the machine 12 and is communicatively coupled to the machine 12 over a network 16. Note that the computing system 14 may include other equipment (e.g., gateways, routers, switches, etc.), with functionality distributed among one or more facilities, such as an Internet Service Provider (ISP) facility, regional or local machine manufacturer's representative facility, manufacturer's facility, residence, among other facilities. In some embodiments, the computing system 14 may store and update one or more data structures (e.g., databases) of geographical information (e.g., maps, including field boundary coordinates, topographic information, etc.) for fields farmed using the machine 12 or other machines. Other data may be stored, such as the manufacturer of the machine 12 or other machines used on the field, the product dispensed on the field (e.g., in the case of planting or spraying applications), among other useful data. In some embodiments, some or all of the functionality performed by the control unit may be performed by the computing system 14. For instance, the data for the sensed parameter may be communicated to the computing system 14, for example, which may process the sensed data. Processing of the data (e.g., whether at the control unit or the computing system 14 or both) includes applying geostatistical methods to the received (sensed) data and generating the estimated data. When the computing system 14 has performed the processing, it communicates the processed (e.g., estimated) data back to the machine 12.
[0019] The network 16 may include one or more networks based on one or a plurality of communication protocols. For instance, the network 16 may comprise a wide area network, such as the Internet, one or more local area networks, such as a radio frequency (RF) network, a cellular network, POTS, WiFi, WiMax, and/or other networks, such as a satellite network. In one embodiment, the computing system 14 may host a web-service, or serve as a gateway to one or more other servers in the Internet (e.g., as a gateway to a cloud service), and be coupled to the machine 12 (e.g., via a communications interface) over a wireless, cellular connection. The machine 12 comprises a position indication component, as explained below, that is coupled to a satellite network.
[0020] Having generally described the machine 12 and an embodiment of the parameter sensing feedback system 10, attention is directed to FIGS. 2-3, which depict example screen displays 18 of an embodiment of a parameter sensing feedback system 10. The screen displays 18 are coupled to the aforementioned control unit, and provide the operator with feedback of one or more sensed parameters as the machine 12 traverses a field. It should be appreciated by one having ordinary skill in the art that the screen displays 18 are for illustration, and that in some embodiments, additional, fewer, or different data may be presented on the screen display 18. The parameter sensing feedback system 10 may adjust or assign estimated spatial values to a plurality of classes (e.g., ranges of estimated values for a given sensed parameter). The classes may be automatically generated or, in some embodiments, may be provided by an operator. In the examples that follow, yield is the parameter being sensed and estimated, though operations for processing data for other parameters similarly apply. For example, each class may correspond to a range of estimated yield values such as bushels per acre. If the sensed and estimated data relates to moisture content, each class may correspond to a range of estimated moisture content percentages. Other properties may be used, as explained above. In FIGS. 2-3, with continued reference to FIG. 1 , a map 20 is presented on the screen display 18, such as during farming operations implemented by the machine 12. The map 20 illustrates, in this example, two distinct regions, one being a field 22 on which the machine 12 works, the other being a region comprising a plurality of polygons 24 (e.g., 24A, 24B, 24C, 24D, 24E, etc.). The polygons 24 represent processed data (e.g., estimated data generated based on geostatistical methods) corresponding to one or more parameters sensed by a sensor system of, or associated with, the parameter sensing feedback system 10. 
The polygons 24 also correspond to the areas of the field 22 worked by the machine 12 and for which work has been completed. Each polygon 24 is assigned a given class (e.g., automatically or operator-configured). Differences in class may be represented on the map 20 by any one or more of a plurality of methods. For instance, a polygon 24 corresponding to one class may have a different color, pattern, and/or alphanumeric located within the polygon, when compared to a polygon 24 assigned to a different class. Also shown on the map 20 is a leading edge icon 26, which has an arrow indicating the direction of movement of the machine 12 as well as illustrating the change in size of a polygon 24 (or as an indicator of commencement of a new polygon 24) as the machine 12 advances along the field 22 within an area of a same class or to a different class, respectively. In some embodiments, additional and/or other icons may be used to convey the advancing border of the polygon, such as a straight edge without an arrow, an icon representing a likeness of the machine 12, among other icons. Note that the leading edge icon 26 merely illustrates the advancement of the machine 12 within the same class or into a different class, and that the size of each polygon 24 may change along other sides as the map 20 is updated through the continued processing of data.
[0021] In the description associated with FIGS. 2-3, actions taking place in the machine 12 as it travels along the actual field are described from the perspective of an operator viewing the movement of the machine 12 on the screen display 18, with grain yield (e.g., bushels per acre) used as the sensed parameter in the following description. In the examples depicted in FIGS. 2-3, as the machine 12 begins working the field 22, it moves along a section of the field 22 represented at the lower left of the map 20 and moves toward the upper left of the map 20. Data captured at the lower left portion of the field 22 is assigned to class "A," as represented by polygon 24A. Class A may represent a range of yield values, such as 100-110 bushels per acre. The actual value of A may be presented in an area overlapping at least a portion of the polygon 24A responsive to operator input. For instance, an operator may physically touch the screen display 18 at the location of the polygon 24A, which, in touch-screen type technology, prompts a signal that represents the coordinate value on the screen, prompting the control unit to present the estimated value or range of values of the class A in an area within or at least partially overlapping the polygon 24A. In some embodiments, the operator may maneuver a cursor over the polygon 24A, or verbally request a displayed value for the corresponding label A in polygon 24A, to achieve the same or similar effect. Note that although the class is denoted as alphanumeric "A," a numeric value (e.g., estimated value or estimated average value) may be used in place of (or supplemental to) "A," or in some embodiments, an alphanumeric comprising a range of values (e.g., 100-110 bushels per acre) corresponding to the class may be used in place of A (or supplemental to A). 
In some embodiments, no persistently displayed alphanumeric is presented, where distinctions in class are merely shown with differences in color and/or pattern, and where the operator may invoke a transitory presentation of the range or estimated value in a manner as described above. Note that in some embodiments, an icon may be presented in each polygon 24 to visually illustrate the value of the parameter corresponding to the polygon 24. For instance, in the case of yield, a graphic of a bushel may be presented with a volume of material shown in a manner commensurate with the estimated value relative to a total span of ranges (e.g., mid-range may show a bushel half-full). Note that the alphanumerics (e.g., A, B, C, etc.), which include numeric values (average estimated value, median estimated value, etc.) and ranges of estimated values, and pop-up icons or other icons presented in the polygon 24 or at least overlapping a portion of the associated polygon 24, are collectively referred to herein as indicators. Note that the alphanumeric may be presented as a standalone value, or framed according to any of a plurality of graphic formats (e.g., highlighted, underlined, etc.). Note also that the polygon 24A, like other polygons 24, is optionally framed along its periphery with a border 28, which further demarcates the shape of the polygon 24A and facilitates visual distinctiveness with adjacent polygons 24 and/or the un-worked field 22.
[0022] As the machine 12 moves along the field 22 (from the bottom toward the top in FIGS. 2-3), the machine 12 gets to a point where it begins assigning data to class "B," as represented by polygon 24B. Note that in some embodiments, there may be different classes concurrently existing along the same implement (e.g., class A in one area covered by the implement, class B in another area covered by the implement). As explained for the polygon 24A and its associated class "A," another type of alphanumeric and/or icon may be used in lieu of (or supplemental to) "B." Similar to the example for polygon 24A, the estimated value or range of estimated values corresponding to class B may be prompted for display within, or otherwise overlapping at least in part, the polygon 24B through one or more actions by the operator. Like all other polygons 24, the polygon 24B is likewise presented with a border 30 along the periphery of the polygon 24B, which facilitates visual distinctiveness among adjacent polygons 24 and/or the un-worked field 22, as well as any change in shape. Note that the various mechanisms described above to distinguish classes may be used alone or in different combinations. In other words, each polygon 24 with associated different classes may be distinguished using one or any combination of different colors, shadings, fill patterns, icons, and/or alphanumerics (the latter invoked by the operator or persistently displayed). As the data collected by the machine 12 changes over the course of the working path, the data is assigned to different classes. Along the first working path (left side of FIGS. 2-3, bottom to top) the data is assigned to classes A, then B, then C (polygon 24C), then B (polygon 24D), and then A (polygon 24E). Classes B and C represent different estimated values or estimated average values or different ranges of estimated values, such as 111-120 and 121-130, respectively. 
Note that the values provided are merely examples, and other values and/or different spans for the individual ranges may be used.
[0023] As the machine traverses the second working path 32 (e.g., in the direction as shown by the arrow of the icon 26), additional data is processed to generate intermediate estimated values. The new estimated values are also placed into classes, and the visual areas (e.g., polygons 24) associated with each class change shape (e.g., expand), such as the large A area near the top corresponding to polygon 24E and polygon 24D corresponding to class B adjacent to the polygon 24E. As more of the field 22 is worked, the polygons 24 corresponding to the different classes may increase in size or, generally, change shape.
[0024] As more measurements are taken and more data points are gathered, original estimates may be updated. As data is collected for the second working path 32, for example, some or all of the estimates for the first row may be updated to reflect the newly-collected data. Thus, the polygons 24 in the first row may change in size or shape, may be eliminated entirely, and/or new areas may be created. As additional data points are collected, all or a portion of the estimated data points, pertaining to the entire field 22, may be recalculated.
[0025] The data classification information, such as an estimated value or range of estimated values corresponding to each class may be configured by the operator or may be automatically generated based on the estimated data. The parameter sensing feedback system 10 may generate multiple classes, for example, corresponding to uniform divisions between the lowest and highest estimated values. The class values, whether determined by the operator or automatically generated by the parameter sensing feedback system 10, may be visible on the map 20 itself (e.g., persistently displayed) or may be presented temporarily via a pop-up window or user interface tab when selected by the operator. The operator may adjust the granularity of the classes and areas at any time via a user interface element such as a button or knob. Furthermore, in some embodiments, rather than using classes, the parameter sensing feedback system 10 may present each estimated data point on the screen display 18.
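The automatic class generation described above — dividing the span between the lowest and highest estimated values into uniform ranges — can be sketched in a few lines. The following Python fragment is only illustrative; the function names, the three-class default, and the letter labels are assumptions for the example, not part of the disclosure.

```python
import string

def make_classes(estimates, n_classes=3):
    """Divide the span between the lowest and highest estimated values
    into n_classes uniform ranges, labeled A, B, C, ..."""
    lo, hi = min(estimates), max(estimates)
    width = (hi - lo) / n_classes or 1.0  # guard against a zero span
    return [(string.ascii_uppercase[i], lo + i * width, lo + (i + 1) * width)
            for i in range(n_classes)]

def classify(value, classes):
    """Assign a value to its class label; the top edge of the span
    falls into the last class."""
    for label, _, upper in classes:
        if value < upper:
            return label
    return classes[-1][0]
```

An operator-adjustable granularity (the button or knob mentioned above) would simply correspond to changing `n_classes` and re-running `make_classes` over the current estimates.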
[0026] The estimated data may be mapped to the area worked by the machine 12 and/or may be mapped to areas beyond the area worked by the machine 12. The parameter sensing feedback system 10, for example, may generate estimated data corresponding to a band of a designated or predetermined width around an outer edge of the worked area. Furthermore, the parameter sensing feedback system 10 may generate estimated data corresponding to a proximate row or rows.
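The band of designated width around the worked area can be modeled as the set of grid cells within that distance of already-worked cells. The helper below is a hypothetical sketch under the assumption of a uniform square grid; the disclosure does not prescribe any particular spatial model.

```python
import math

def band_cells(worked, band_width, cell_size=1.0):
    """Return grid cells within band_width of the worked area but not yet
    worked; estimated data may be generated for these cells ahead of the
    machine. `worked` is a set of (x, y) cell centers."""
    reach = int(math.ceil(band_width / cell_size))
    band = set()
    for (wx, wy) in worked:
        for dx in range(-reach, reach + 1):
            for dy in range(-reach, reach + 1):
                cell = (wx + dx * cell_size, wy + dy * cell_size)
                if cell not in worked and \
                        math.hypot(dx * cell_size, dy * cell_size) <= band_width:
                    band.add(cell)
    return band
```

For a single worked cell and a band width of one cell, this yields the four axis-adjacent neighbors, which is the intuitive "one row ahead and to the side" prediction region.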
[0027] Referring to FIG. 3, the map 20 is presented on the screen display 18 with the persistently labeled classes (A, B, C) for each of the polygons 24, where an indicator such as a pop-up icon 34 is invoked by the operator as shown for polygon 24C. In other words, the operator in this example seeks to learn of the estimated value or range of estimated values for class C, and in one embodiment, touches the screen 18 in a location over the polygon 24C to invoke the pop-up icon 34 (or invokes the pop-up icon 34 using other user interface methods, as previously described). As shown, the pop-up icon 34 replaces the label C (though in some embodiments, the label C may remain), and includes a range of estimated yield values for class C associated with polygon 24C, wherein the pop-up icon 34 is presented entirely within the polygon 24C. In some embodiments, only a portion of the pop-up icon 34 is overlapping the polygon 24C. Note that, as described above, other mechanisms for visually distinguishing and/or representing the associated class for each polygon 24 may be used in conjunction with the pop-up icon 34 in some embodiments, and in some embodiments, the pop-up icon 34 may be persistently displayed in lieu of any alphanumeric and/or other icons.
[0028] With continued reference to FIGS. 1-3, attention is now directed to FIG. 4A, which illustrates an embodiment of a parameter sensing feedback system 10. It should be appreciated within the context of the present disclosure that some embodiments may include additional components or fewer or different components, and that the example depicted in FIG. 4A is merely illustrative of one embodiment among others. Further, in some embodiments, the parameter sensing feedback system 10 may be distributed among plural machines. For instance, functionality of the parameter sensing feedback system 10 may be distributed among a towing machine and a towed machine, such as when sensing the chemical output (e.g., sensed parameter) of a towed implement. As another example, multiple machines may operate in the same field at the same time. Parameters may be sensed among the multiple machines, and shared via an ad hoc network among the multiple machines, or via communication over the network 16 to and from the computing system 14 (e.g., serving as a cloud system or Internet server for each machine). Parameter sensing feedback software in each machine may apply the geostatistical methods based on the data received from one or more of the machines to generate estimated data, in addition to the host machine, and render the dynamically changing polygons based on the estimated data onto the map of the respective screen display 18. In some embodiments, the estimated data (in lieu of the sensed data) may be communicated among the machines for rendering of polygons on each respective screen display 18. The parameter sensing feedback system 10 comprises one or more control units, such as the control unit 36. 
The control unit 36 is coupled via one or more networks, such as network 38 (e.g., a CAN network or other network, such as a network in conformance to the ISO 11783 standard, also referred to as "Isobus"), to a position indication component 40 (e.g., which may include one or more receivers that include the ability to access one or more constellations jointly or separately via a global navigation satellite system (GNSS), such as global positioning systems (GPS), GLONASS, Galileo, among other constellations), a user interface 42 (which in one embodiment includes the screen display 18), a network interface 44, and one or more sensors of a sensor system 46. Note that operations of the parameter sensing feedback system 10 are primarily disclosed herein in the context of control via a single control unit 36, with the understanding that additional control units 36 may be involved in one or more of the disclosed functions.
[0029] In one embodiment, the position indication component 40 comprises a GNSS receiver that continually updates the control unit 36 with real time position information that indicates a current geographical position of the machine 12, which the control unit 36 compares to a map (e.g., geographical coordinates within a defined region from where the machine 12 is operating) to enable the presentation of machine operations onto the map. The user interface 42 may include one or more of a keyboard, mouse, microphone, touch-type display device, such as the screen display 18, joystick, steering wheel, or other devices (e.g., switches, immersive head set, etc.) that enable input and/or output by an operator (e.g., to prompt indicators onto a map). As noted above, the screen display 18 may be a component of the user interface 42.
[0030] The network interface 44 comprises hardware and/or software that enable wireless connection to the network 16 (FIG. 1). For instance, the network interface 44 may cooperate with browser software or other software of the control unit 36 to communicate with the computing system 14 (FIG. 1), such as via cellular links, among other telephony communication mechanisms and radio frequency communications. In some embodiments, the computing system 14 may host a cloud service, whereby all or a portion of the functionality of the control unit 36 resides on the computing system 14 and is accessed by the control unit 36 via the network interface 44. For instance, the computing system 14 may receive position information from the control unit 36 (via the network interface 44) and based on geographic information stored at, or in association with, the computing system 14, estimate data based on one or more sensed properties for communication back to the control unit 36 for display on the screen display 18. The network interface 44 may comprise MAC and PHY components (e.g., radio circuitry, including transceivers, antennas, etc.), as should be appreciated by one having ordinary skill in the art. In some embodiments, the network interface 44 may enable wireless communication (e.g., sharing of data) among multiple machines operating in the same field or generally within wireless range, such as in an ad hoc network. Multiple machines may also communicate with each other via the network interface 44 and the cloud hosted by the computing system 14.
[0031] The sensor system 46 may comprise one or more sensors of the machine 12 to sense machine and/or field parameters, such as grain yield, crop damage, moisture, fluid level, fuel level, chemical production, among other parameters for which an operator of the machine may wish to have visual feedback. The sensors of the sensor system 46 may be embodied as contact (e.g., electromechanical sensors, such as position sensors, safety switches, etc.) and non-contact type sensors (e.g., photo-electric, inductive, capacitive, ultrasonic, etc.), all of which comprise known technology.
[0032] In one embodiment, the control unit 36 is configured to receive and process information from the network interface 44, the position indication component 40, the sensor system 46, and/or the user interface 42. For instance, the control unit 36 may receive input from the user interface 42 (e.g., screen display 18), such as to enable the operator to prompt (and/or acknowledge) a pop-up icon or other indicator presented in overlapping manner on polygons of a map. In some embodiments, the control unit 36 receives data from the sensor system 46 and processes the data using one or more geostatistical methods (e.g., kriging, inverse distance weighting, etc.) to generate (e.g., extrapolate or generally, fill-in) estimated data based on the data received from the sensor system 46. The control unit 36 presents the map and a plurality of polygons corresponding to the processed data on the map, while framing the periphery of each polygon with a border in some embodiments (though some embodiments may omit the border for all or a portion of the polygons) and presenting on each polygon, or at least partially overlapping each polygon, an indicator of the estimated value(s) in the form of an alphanumeric (e.g., estimated value, label, etc.) and/or icon (e.g., pop-up icon, symbolic icon, etc.). The indicator may be transitory in nature (e.g., invoked by an operator) or persistently displayed. The control unit 36 is also configured to cause the transmission of information (and/or enable the reception of information) via the network interface 44 for communication with the computing system 14, as set forth above.
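Inverse distance weighting, one of the geostatistical methods named above, lends itself to a compact sketch: the estimate at an unsampled point is a weighted average of the measurements, with nearer measurements weighing more. The function name, the default power of 2, and the coincidence guard below are assumptions for illustration, not the disclosed implementation.

```python
def idw_estimate(samples, qx, qy, power=2.0, eps=1e-12):
    """Estimate the sensed parameter at (qx, qy) from (x, y, value)
    samples using inverse distance weighting."""
    num = den = 0.0
    for x, y, v in samples:
        d2 = (x - qx) ** 2 + (y - qy) ** 2
        if d2 < eps:                 # query coincides with a measurement
            return v
        w = d2 ** (-power / 2.0)     # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den
```

Evaluating this over a grid of unworked positions produces the "fill-in" estimated data that the control unit 36 then assigns to classes and renders as polygons.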
[0033] FIG. 4B further illustrates an example embodiment of the control unit 36. One having ordinary skill in the art should appreciate in the context of the present disclosure that the example control unit 36 is merely illustrative, and that some embodiments of control units may comprise fewer or additional components, and/or some of the functionality associated with the various components depicted in FIG. 4B may be combined, or further distributed among additional modules, in some embodiments. It should be appreciated that, though described in the context of residing in the machine 12, in some embodiments, the control unit 36, or all or a portion of its corresponding functionality, may be implemented in a computing device or system (e.g., computing system 14) located external to the machine 12, or distributed among plural machines in some embodiments. Referring to FIG. 4B, with continued reference to FIG. 4A, the control unit 36 is depicted in this example as a computer system, but may be embodied as a programmable logic controller (PLC), field programmable gate array (FPGA), application specific integrated circuit (ASIC), among other devices. It should be appreciated that certain well-known components of computer systems are omitted here to avoid obfuscating relevant features of the control unit 36. In one embodiment, the control unit 36 comprises one or more processors (also referred to herein as processor units or processing units), such as processor 48, input/output (I/O) interface(s) 50, and memory 52, all coupled to one or more data busses, such as data bus 54. The memory 52 may include any one or a combination of volatile memory elements (e.g., random-access memory RAM, such as DRAM, and SRAM, etc.) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.). 
The memory 52 may store a native operating system, one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc.
[0034] In some embodiments, the memory 52 may store geographical information, such as one or more field maps (e.g., geographical coordinates of the entire field). The geographical information may include topographic features of the fields in some embodiments. The field maps may be in the form of aerial imagery or recorded geographical coordinates of one or more fields, including recorded entry points, identified boundaries of the one or more fields, paths or waylines previously determined, customizations, and other data pertinent to farming. In some embodiments, the geographical information may be stored remotely (e.g., at the computing system 14), or stored in a distributed manner (e.g., in memory 52 and remotely). In the embodiment depicted in FIG. 4B, the memory 52 comprises an operating system 56, and parameter sensing feedback (PSF) software 58. It should be appreciated that in some embodiments, additional or fewer software modules (e.g., combined functionality) may be deployed in the memory 52 or additional memory. In some embodiments, a separate storage device may be coupled to the data bus 54, such as a persistent memory (e.g., optical, magnetic, and/or semiconductor memory and associated drives).
[0035] The parameter sensing feedback software 58, as executed by the processor 48, is described below as performing certain functions, though it should be appreciated by one having ordinary skill in the art in the context of the present disclosure that some functions may be offloaded by other software and provided as input to the parameter sensing feedback software 58 in some embodiments. The parameter sensing feedback software 58 receives position information (e.g., from the position indication component 40) and compares the position information with geographic information stored locally or remotely and determines the position of the machine 12 in the field. The parameter sensing feedback software 58 also receives data corresponding to a sensed parameter from the sensor system 46. For instance, the data is sampled at defined times and/or positions of the machine 12 and provided to the parameter sensing feedback software 58. The parameter sensing feedback software 58 processes these samples (data) to produce a real time map with a set of contiguous classes for the sensed parameters. For instance, the parameter sensing feedback system 10 processes the data by applying geostatistical methods to process the sampled spatial data to be rendered in a contiguous real time map for the area covered by the machine 12 and/or coupled implement (e.g., header, towed implement, etc.). The geostatistical method or methods used may be selected and configured by an operator of the machine 12, or in some embodiments, automatically selected. Additional processing includes filtering the data, and/or other signal conditioning. 
Stated otherwise, new spatial data gathered by the sensor system 46 on, or associated with, the machine 12 is input to a geostatistical algorithm (or algorithms) of the parameter sensing feedback software 58 to update the real time map using a user interface (e.g., graphical user interface) component of the parameter sensing feedback software 58 or as communicated to a separate user interface software component. Updating includes adjusting the classification of the data over either an operator-configurable set of classes or an automatic classification. The area onto which the processed data is mapped (e.g., in the form of polygons with an optional border and associated indicator) can be synchronized with the worked area or configured to predict into a certain band around the worked area. The data classification values can be configured by the operator to be visible on the map itself (e.g., persistent) or as a pop-up icon (e.g., temporary or transitory) presenting a detailed data information box when the operator touches an area of the map on the screen display 18 or otherwise invokes it using other user interface mechanisms.
[0036] Execution of the parameter sensing feedback software 58 may be implemented by the processor 48 under the management and/or control of the operating system 56. In some embodiments, the operating system 56 may be omitted and a more rudimentary manner of control implemented. The processor 48 may be embodied as a custom-made or commercially available processor, a central processing unit (CPU) or an auxiliary processor among several processors, a semiconductor based microprocessor (in the form of a microchip), a macroprocessor, one or more application specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and/or other well-known electrical configurations comprising discrete elements both individually and in various combinations to coordinate the overall operation of the control unit 36.

[0037] The I/O interfaces 50 provide one or more interfaces to the network 38 and other networks. In other words, the I/O interfaces 50 may comprise any number of interfaces for the input and output of signals (e.g., analog or digital data) for conveyance of information (e.g., data) over the network 38. The input may comprise input by an operator (local or remote) through the user interface 42 and input from signals carrying information from one or more of the components of the parameter sensing feedback system 10, such as the position indication component 40, the sensor system 46, and/or the network interface 44, among other devices.
[0038] When certain embodiments of the control unit 36 are implemented at least in part with software (including firmware), as depicted in FIG. 4B, it should be noted that the software can be stored on a variety of non-transitory computer-readable media for use by, or in connection with, a variety of computer-related systems or methods. In the context of this document, a computer-readable medium may comprise an electronic, magnetic, optical, or other physical device or apparatus that may contain or store a computer program (e.g., executable code or instructions) for use by or in connection with a computer-related system or method. The software may be embedded in a variety of computer-readable media for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
[0039] When certain embodiments of the control unit 36 are implemented at least in part with hardware, such functionality may be implemented with any or a combination of the following technologies, which are all well-known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.

[0040] In view of the above description, it should be appreciated that one embodiment of a parameter sensing feedback method 60, depicted in FIG. 5, comprises receiving updated position information while traversing a field (62); receiving data corresponding to a sensed parameter (64); processing the data using a geostatistical method (66); and presenting concurrently on a map and in real time a dynamically changing area rendering and an indicator corresponding to respective detected positions in the field, the area rendering associated with the processed data, the indicator overlapping at least a portion of the area rendering and comprising an alphanumeric value corresponding to the processed data (68).
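One iteration of method 60 can be sketched as a single function: fold in the latest sensed value, re-estimate each grid cell, and return the per-cell class labels the display would render as polygons. For brevity, the estimation step here is a nearest-sample stand-in for the geostatistical method of block 66, and the class thresholds (mirroring the document's example ranges, with B spanning 111-120 and C spanning 121-130) are illustrative assumptions.

```python
def feedback_step(samples, new_position, new_value, grid_cells):
    """One iteration of method 60: receive position and sensed value
    (blocks 62-64), re-estimate every grid cell (block 66; a nearest-
    sample stand-in for the geostatistical step), and return per-cell
    class labels for rendering as polygons (block 68)."""
    samples.append((new_position, new_value))

    def estimate(cell):
        # nearest-neighbour stand-in for kriging / inverse distance weighting
        _, v = min(samples,
                   key=lambda s: (s[0][0] - cell[0]) ** 2 + (s[0][1] - cell[1]) ** 2)
        return v

    def classify(v):
        return 'A' if v <= 110 else 'B' if v <= 120 else 'C'

    return {cell: classify(estimate(cell)) for cell in grid_cells}
```

Because every cell is re-estimated on each call, earlier cells can change class as new measurements arrive, matching the polygon reshaping described in paragraph [0024].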
[0041] Any process descriptions or blocks in flow diagrams should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included within the scope of the embodiments in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present disclosure.
[0042] In this description, references to "one embodiment", "an embodiment", or "embodiments" mean that the feature or features being referred to are included in at least one embodiment of the technology. Separate references to "one embodiment", "an embodiment", or "embodiments" in this description do not necessarily refer to the same embodiment and are also not mutually exclusive unless so stated and/or except as will be readily apparent to those skilled in the art from the description. For example, a feature, structure, act, etc. described in one embodiment may also be included in other embodiments, but is not necessarily included. Thus, the present technology can include a variety of combinations and/or integrations of the embodiments described herein. Although the control systems and methods have been described with reference to the example embodiments illustrated in the attached drawing figures, it is noted that equivalents may be employed and substitutions made herein without departing from the scope of the disclosure as protected by the following claims.

Claims

At least the following is claimed:
1. A parameter sensing feedback system for an agricultural machine, the system comprising:
a position indication component configured to generate position information that indicates a current geographic position of the machine;
a sensor system;
a screen display; and
a control unit configured to:
receive the position information from the position indication component;
receive first data corresponding to a parameter detected by the sensor system;
process the first data using a geostatistical method;
associate the processed first data with a first class selected from among a plurality of classes;
present a map of a field corresponding to the position information on the screen display; and
present on the map:
a dynamically changing first area rendering, the first area rendering associated with the first class; and
a first alphanumeric value corresponding to the first class in an area overlapping the first area rendering.
2. The parameter sensing feedback system of claim 1, wherein the control unit is further configured to present a border around a periphery of the first area rendering.
3. The parameter sensing feedback system of claim 1, wherein the control unit is further configured to:
receive second data corresponding to the parameter detected by the sensor system;
process the second data using a geostatistical method; and
associate the processed second data with a second class selected from among the plurality of classes, wherein the control unit is further configured to present on the map:
a dynamically changing second area rendering adjacent the first area rendering, the second area rendering associated with the second class; and
a second alphanumeric value corresponding to the second class in an area overlapping the second area rendering, wherein the first alphanumeric value is different from the second alphanumeric value.
4. The parameter sensing feedback system of claim 3, wherein the control unit is further configured to present a border around a periphery of the second area rendering.
5. The parameter sensing feedback system of claim 1, wherein the control unit is configured to present the first alphanumeric value by presenting the received first data or the processed first data.
6. The parameter sensing feedback system of claim 1, wherein the control unit is configured to present the first alphanumeric value by presenting a range of values corresponding to the first class.
7. The parameter sensing feedback system of claim 1, wherein the control unit is configured to revise a shape of the first area rendering responsive to a change in the position information.
8. The parameter sensing feedback system of claim 1, wherein the control unit is configured to present the first alphanumeric value in response to a signal associated with operator input.
9. The parameter sensing feedback system of claim 8, wherein the signal associated with the operator input comprises a signal generated in response to sensing at the screen display a touch-screen entry at a screen coordinate encompassed by the first area rendering.
10. The parameter sensing feedback system of claim 1, wherein the control unit is further configured to process the first data using plural geostatistical methods.
11. The parameter sensing feedback system of claim 1, wherein the position indication component, sensor system, screen display, and control unit are hosted by a first agricultural machine, further comprising a second agricultural machine working in proximity to the first agricultural machine, the second agricultural machine comprising a control unit, a sensor system, and a network interface, the control unit of the second agricultural machine configured to receive data corresponding to the parameter detected by the sensor system of the second agricultural machine, process the data using one or more geostatistical methods, and communicate the processed data over the network interface of the second agricultural machine, wherein the control unit of the first agricultural machine receives the processed data from the second agricultural machine and presents area renderings on the map based on the processed data of the first agricultural machine and the processed data received from the second agricultural machine.
12. The parameter sensing feedback system of claim 1, wherein the control unit is further configured to process the data by predicting a class for data expected to be received via traversal of the machine and presenting the area rendering based at least in part on the predicted class.
13. A parameter sensing feedback system, comprising:
an agricultural machine, comprising:
a screen display;
a sensor system; and
a position indication component configured to generate position information that indicates a current geographic position of the machine; and
a control unit configured to:
receive first data corresponding to a parameter detected by the sensor system;
process the first data using one or more geostatistical methods;
present a map on the screen display corresponding to the position information; and
present on the map a first area rendering and a first indicator, the first area rendering associated with the processed first data, the first indicator overlapping at least a portion of the first area rendering and comprising an alphanumeric value corresponding to the processed first data.
14. The parameter sensing feedback system of claim 13, wherein the control unit is configured to present on the map the first indicator by presenting a pop-up icon that includes the alphanumeric value.
15. The parameter sensing feedback system of claim 14, wherein the control unit is configured to present the pop-up icon responsive to operator intervention.
16. The parameter sensing feedback system of claim 13, wherein the control unit is further configured to:
receive second data corresponding to the parameter detected by the sensor system;
process the second data using the one or more geostatistical methods; and
present on the map a second area rendering and a second indicator, the second area rendering based on the processed second data and adjacent the first area rendering, the second indicator overlapping at least a portion of the second area rendering and comprising an alphanumeric value corresponding to the processed second data.
17. The parameter sensing feedback system of claim 16, wherein the control unit is further configured to present a border between the first area rendering and the second area rendering, the border visually demarcating a boundary between the first area rendering and the second area rendering, and wherein the control unit is further configured to present the first area rendering with a different color or pattern than the second area rendering.
18. The parameter sensing feedback system of claim 16, wherein the agricultural machine comprises an implement operatively coupled to the agricultural machine, wherein the first area rendering is associated by the control unit with a first portion of the implement and concurrently, the second area rendering is associated by the control unit with a second portion of the implement.
19. The parameter sensing feedback system of claim 13, further comprising a second agricultural machine working in proximity to the first agricultural machine, the second agricultural machine comprising a first control unit, a sensor system, and a network interface, the first control unit of the second agricultural machine configured to receive data corresponding to the parameter detected by the sensor system of the second agricultural machine and communicate the received data over the network interface of the second agricultural machine, wherein the control unit resides in the first agricultural machine, the control unit receiving the data from the second agricultural machine and processing the first data and the data received from the second agricultural machine using the one or more geostatistical methods, wherein the control unit of the first agricultural machine presents area renderings on the map based on the data received from the first and second agricultural machines.
20. A parameter sensing feedback method, comprising:
receiving updated position information while traversing a field;
receiving data corresponding to a sensed parameter;
processing the data using a geostatistical method; and
presenting concurrently on a map and in real time a dynamically changing area rendering and an indicator corresponding to respective detected positions in the field, the area rendering associated with the processed data, the indicator overlapping at least a portion of the area rendering and comprising an alphanumeric value corresponding to the processed data.
PCT/US2014/055160 2013-09-13 2014-09-11 Method to automatically estimate and classify spatial data for use on real time maps WO2015038751A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361877314P 2013-09-13 2013-09-13
US61/877,314 2013-09-13
US201361917962P 2013-12-19 2013-12-19
US61/917,962 2013-12-19

Publications (1)

Publication Number Publication Date
WO2015038751A1 true WO2015038751A1 (en) 2015-03-19


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998021929A1 (en) * 1996-11-22 1998-05-28 Case Corporation Scouting system for an agricultural field

Cited By (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10314224B2 (en) 2016-03-30 2019-06-11 Autonomous Solutions, Inc. Multiple harvester planner
WO2017172610A1 (en) * 2016-03-30 2017-10-05 Autonomous Solutions, Inc. Multiple harvester planner
US12073658B2 (en) 2016-10-10 2024-08-27 Ålö AB Agriculture operation monitoring system and monitoring method
WO2018070924A1 (en) * 2016-10-10 2018-04-19 Ålö AB An agriculture operation monitoring system and monitoring method
CN110622200A (en) * 2017-06-26 2019-12-27 株式会社久保田 Farmland map generation system
KR20200019848A (en) * 2017-06-26 2020-02-25 가부시끼 가이샤 구보다 Pavement map generation system
EP3648045A4 (en) * 2017-06-26 2021-03-24 Kubota Corporation Farm field map generation system
CN110622200B (en) * 2017-06-26 2023-10-27 株式会社久保田 Farmland map generation system
KR102593355B1 (en) * 2017-06-26 2023-10-25 가부시끼 가이샤 구보다 Pavement map generation system
US11589508B2 (en) 2017-06-26 2023-02-28 Kubota Corporation Field map generating system
US11672203B2 (en) 2018-10-26 2023-06-13 Deere & Company Predictive map generation and control
US12069978B2 (en) 2018-10-26 2024-08-27 Deere & Company Predictive environmental characteristic map generation and control system
US11589509B2 (en) 2018-10-26 2023-02-28 Deere & Company Predictive machine characteristic map generation and control system
US12010947B2 (en) 2018-10-26 2024-06-18 Deere & Company Predictive machine characteristic map generation and control system
US11240961B2 (en) 2018-10-26 2022-02-08 Deere & Company Controlling a harvesting machine based on a geo-spatial representation indicating where the harvesting machine is likely to reach capacity
US11178818B2 (en) 2018-10-26 2021-11-23 Deere & Company Harvesting machine control system with fill level processing based on yield data
US11653588B2 (en) 2018-10-26 2023-05-23 Deere & Company Yield map generation and control system
EP3689125B1 (en) 2019-01-31 2022-11-30 CNH Industrial Belgium N.V. Combine loss monitor mapping
US11079725B2 (en) 2019-04-10 2021-08-03 Deere & Company Machine control using real-time model
US11829112B2 (en) 2019-04-10 2023-11-28 Deere & Company Machine control using real-time model
US11467605B2 (en) 2019-04-10 2022-10-11 Deere & Company Zonal machine control
US11778945B2 (en) 2019-04-10 2023-10-10 Deere & Company Machine control using real-time model
US11234366B2 (en) 2019-04-10 2022-02-01 Deere & Company Image selection for machine control
US11650553B2 (en) 2019-04-10 2023-05-16 Deere & Company Machine control using real-time model
US11510365B2 (en) 2019-09-03 2022-11-29 Cnh Industrial America Llc Harvesting header segment display and map
US11641800B2 (en) 2020-02-06 2023-05-09 Deere & Company Agricultural harvesting machine with pre-emergence weed detection and mitigation system
US12035648B2 (en) 2020-02-06 2024-07-16 Deere & Company Predictive weed map generation and control system
US11957072B2 (en) 2020-02-06 2024-04-16 Deere & Company Pre-emergence weed detection and mitigation system
US11477940B2 (en) 2020-03-26 2022-10-25 Deere & Company Mobile work machine control based on zone parameter modification
US11825768B2 (en) 2020-10-09 2023-11-28 Deere & Company Machine control using a predictive map
US11895948B2 (en) 2020-10-09 2024-02-13 Deere & Company Predictive map generation and control based on soil properties
US11592822B2 (en) 2020-10-09 2023-02-28 Deere & Company Machine control using a predictive map
US12080062B2 (en) 2020-10-09 2024-09-03 Deere & Company Predictive map generation based on seeding characteristics and control
US11727680B2 (en) 2020-10-09 2023-08-15 Deere & Company Predictive map generation based on seeding characteristics and control
US11650587B2 (en) 2020-10-09 2023-05-16 Deere & Company Predictive power map generation and control system
US11844311B2 (en) 2020-10-09 2023-12-19 Deere & Company Machine control using a predictive map
US11845449B2 (en) 2020-10-09 2023-12-19 Deere & Company Map generation and control system
US11849672B2 (en) 2020-10-09 2023-12-26 Deere & Company Machine control using a predictive map
US11849671B2 (en) 2020-10-09 2023-12-26 Deere & Company Crop state map generation and control system
US11864483B2 (en) 2020-10-09 2024-01-09 Deere & Company Predictive map generation and control system
US11871697B2 (en) 2020-10-09 2024-01-16 Deere & Company Crop moisture map generation and control system
US11874669B2 (en) 2020-10-09 2024-01-16 Deere & Company Map generation and control system
US11889788B2 (en) 2020-10-09 2024-02-06 Deere & Company Predictive biomass map generation and control
US11889787B2 (en) 2020-10-09 2024-02-06 Deere & Company Predictive speed map generation and control system
US11635765B2 (en) 2020-10-09 2023-04-25 Deere & Company Crop state map generation and control system
US11927459B2 (en) 2020-10-09 2024-03-12 Deere & Company Machine control using a predictive map
US11946747B2 (en) 2020-10-09 2024-04-02 Deere & Company Crop constituent map generation and control system
US11711995B2 (en) 2020-10-09 2023-08-01 Deere & Company Machine control using a predictive map
US11983009B2 (en) 2020-10-09 2024-05-14 Deere & Company Map generation and control system
US12013698B2 (en) 2020-10-09 2024-06-18 Deere & Company Machine control using a predictive map
US11474523B2 (en) 2020-10-09 2022-10-18 Deere & Company Machine control using a predictive speed map
US12013245B2 (en) 2020-10-09 2024-06-18 Deere & Company Predictive map generation and control system
US11675354B2 (en) 2020-10-09 2023-06-13 Deere & Company Machine control using a predictive map
US12048271B2 (en) 2020-10-09 2024-07-30 Deere & Company Crop moisture map generation and control system
US20220110251A1 (en) 2020-10-09 2022-04-14 Deere & Company Crop moisture map generation and control system
US12069986B2 (en) 2020-10-09 2024-08-27 Deere & Company Map generation and control system
GB2601557A (en) * 2020-12-04 2022-06-08 Canon Res Centre France Generating characteristics data of an agricultural field adapted for precision farming
GB2601557B (en) * 2020-12-04 2023-02-22 Canon Res Centre France Generating characteristics data of an agricultural field adapted for precision farming
US12127500B2 (en) 2021-01-27 2024-10-29 Deere & Company Machine control using a map with regime zones
US12082531B2 (en) 2022-01-26 2024-09-10 Deere & Company Systems and methods for predicting material dynamics
US12058951B2 (en) 2022-04-08 2024-08-13 Deere & Company Predictive nutrient map and control

Similar Documents

Publication Publication Date Title
WO2015038751A1 (en) Method to automatically estimate and classify spatial data for use on real time maps
US20200126166A1 (en) Agricultural implement and implement operator monitoring apparatus, systems, and methods
EP3662730B1 (en) Machine control through active ground terrain mapping
US11995591B2 (en) Computer platform for controlling agricultural assets
US10209235B2 (en) Sensing and surfacing of crop loss data
US10315655B2 (en) Vehicle control based on soil compaction
AU2020289735A1 (en) Generating an agriculture prescription
US10120543B2 (en) Plant emergence system
US10317260B2 (en) Yield data calibration methods
US9961833B2 (en) Crop density map using row sensors
EP3991553B1 (en) Diagnostic system visualization and control for an agricultural spraying machine
US20200072809A1 (en) Agricultural machine with resonance vibration response detection
CA3113479A1 (en) Mobile work machine control based on control zone map data
CA3158773A1 (en) Detecting and generating a rendering of fill level and distribution of material in receiving vehicle(s)
JP7478066B2 (en) Work management system, work management method, and work management program
EP4134902A1 (en) Computer-implemented method
US20230048683A1 (en) Obtaining and augmenting agricultural data and generating an augmented display showing anomalies
EP4135355A1 (en) Computing system, agricultural system with such and method for controlling an agricultural system
BR102022013740A2 (en) OBTAINING AND AUGMENTING AGRICULTURAL DATA AND GENERATING AN AUGMENTED DISPLAY
JP2022036522A (en) Work management system, work management method, and work management program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14772533

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14772533

Country of ref document: EP

Kind code of ref document: A1