
EP3542235B1 - Commande interactive d'une machine dotée d'un retour d'information sur un paramètre de réglage - Google Patents

Commande interactive d'une machine dotée d'un retour d'information sur un paramètre de réglage

Info

Publication number
EP3542235B1
EP3542235B1 EP17803792.5A EP17803792A
Authority
EP
European Patent Office
Prior art keywords
input unit
machine
parameter
regulating
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP17803792.5A
Other languages
German (de)
English (en)
Other versions
EP3542235C0 (fr)
EP3542235A1 (fr)
Inventor
Eberhard DUFFNER
Werner FAULHABER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Arburg GmbH and Co KG
Original Assignee
Arburg GmbH and Co KG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Arburg GmbH and Co KG filed Critical Arburg GmbH and Co KG
Publication of EP3542235A1 publication Critical patent/EP3542235A1/fr
Application granted granted Critical
Publication of EP3542235C0 publication Critical patent/EP3542235C0/fr
Publication of EP3542235B1 publication Critical patent/EP3542235B1/fr
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B29WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29CSHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C45/00Injection moulding, i.e. forcing the required volume of moulding material through a nozzle into a closed mould; Apparatus therefor
    • B29C45/17Component parts, details or accessories; Auxiliary operations
    • B29C45/76Measuring, controlling or regulating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/409Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B29WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29CSHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C45/00Injection moulding, i.e. forcing the required volume of moulding material through a nozzle into a closed mould; Apparatus therefor
    • B29C45/17Component parts, details or accessories; Auxiliary operations
    • B29C45/76Measuring, controlling or regulating
    • B29C2045/7606Controlling or regulating the display unit
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/35Nc in input of data, input till input file format
    • G05B2219/35512Display entered, measured values with bargraph
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36133MMI, HMI: man machine interface, communication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033Indexing scheme relating to G06F3/033
    • G06F2203/0339Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust parameter or to implement a row of soft keys

Definitions

  • The invention relates to a human-machine interface and a method for interactively controlling a control function of a machine through manual control input.
  • Human-machine interfaces for the interactive control of machines are known, for example, for teaching and/or running in an injection molding machine, in which a series of manual adjustment processes are required.
  • The corresponding machines can be connected to and/or have a human-machine interface. It is known to use hard-wired switching elements and/or keypads, keyboards, manipulators and/or touch-sensitive screens for such tasks.
  • The human-machine interface can be connected to the machine via a corresponding data processing unit or a machine control and can enable complex setting processes via this connection.
  • A control device designed as a human-machine interface with multi-touch functionality for a plastics processing machine is known. Machine processes and production parameters can be entered, displayed and changed by an operator on this control unit.
  • WO 2007/025396 A1 shows the assignment of two control panels to each other; the visualization of the button functions is implemented using graphic symbols, whereby the buttons can be dynamically assigned. Machine processes and production parameters are changed with screen support, using visible and tactile touchscreen buttons.
  • The +/- buttons listed there work incrementally (comparable to a volume control on a hi-fi device). Pressing the +/- button increases or reduces a speed incrementally.
  • DE 10 2005 052 725 B3 shows a control element for an injection molding machine with a screen for displaying program data, which is partly designed as a touchscreen.
  • Virtual function keys can be provided off-screen as well as on-screen. For this purpose, the screen protrudes beyond the touchscreen.
  • DE 601 13 685 T2 discloses an interface for machine control with configurable keys, i.e. soft keys, in areas parallel to the screen.
  • US 2004/0021698 A1 proposes a graphical interface for a multifunctional device with a touch-sensitive display on which a hierarchy of graphical objects for multifunctional operation can be displayed to a user.
  • WO 2012/155167 A2 discloses a human-machine interface and a method for manually controlling movements of an electronically controlled machine or system.
  • A proportional or quasi-analog control input means is used, which is designed as a touch-sensitive position detection sensor with at least one-dimensional resolution, in particular in the form of a touchpad or touchscreen.
  • The touch-sensitive position detection sensor is used for the continuous detection of a sweeping actuation movement, or of at least one individual actuation position, within an electronically evaluable actuation surface of the sensor.
  • A temporal sequence of default or position values is determined in relation to the actuation surface and converted into a corresponding temporal sequence of setpoint values for the drive control.
  • The present invention is based on the object of improving the interactive control of a complex machine with regard to its ergonomics, safety and intuitive operability.
  • The human-machine interface has a control input unit and a parameter input unit that can be operated independently of it.
  • A control function of a machine can be controlled manually depending on a control input.
  • An adjusting function can in particular be understood to mean setting a parameter or, preferably, the movement of a movable component that actually takes place on the machine.
  • The setting function can be controlled manually, simultaneously and interactively by means of a corresponding manual control input on the control input unit. This can be done in particular bidirectionally, which can correspond, for example, to moving the movable component forwards and backwards. Such a movement can basically take place at different speeds, in which case the speed can be influenced as a parameter with the parameter input unit.
  • Comparatively high actuating speeds may be desirable in order to complete or achieve the corresponding actuating function as quickly as possible.
  • Lower actuating speeds may also be desirable.
  • Where parameters are mentioned in this application, this refers not only to the speed of a movement but to the parameters that are entered to operate an injection molding machine. These are pressure, temperature, times such as switching times between the injection and holding-pressure phases, times for operating the ejector, partial cycle times, e.g. in multi-component injection, path information for the mold clamping unit, mold height, dosing and injection quantity, volume flow, the quantity of components that can be mixed, and much more that is familiar to a person skilled in the art of injection molding machines.
  • The preferred area of application of the invention is therefore where interdependent parameters have to be entered.
  • An operator of the machine can also use the parameter input unit to control the control parameter manually at the same time as controlling the control function.
  • The parameter display unit, which is arranged together with the parameter input unit on the human-machine interface, makes it possible for the machine operator to always be aware of the control parameter.
  • The manually controlled control parameter can be reported back as a setpoint value, or as an actual value that actually occurs on the machine, using the parameter display unit.
  • The control parameter is functionally dependent on the control function.
  • This can improve the safety, ergonomics and intuitive operability of the machine, as the machine operator receives direct feedback exactly where the actuating speed can be controlled manually, i.e. at the speed input unit, to stay with the speed example.
  • "Arranged together" can be understood to mean an adjacent, one-above-the-other, next-to-one-another and/or superimposed arrangement and/or a spatial assignment.
  • The feedback of the control parameter, such as a speed, can be given in various ways, in particular by indicating a setpoint value, i.e. the manually entered parameter, or an actual value of the parameter that actually occurs on the machine, in particular in the form of a control parameter variable that is dependent on this.
  • The direct feedback of the control parameter, such as a control speed, can be visual, haptic and/or tactile, in particular in the form of a scale display.
  • The parameter display unit has a bargraph display aligned along an elongated touch-sensitive surface of the parameter input unit.
  • A bargraph display can be understood as a flat scale display, in particular a graphical representation of a bar or bar chart.
  • The bargraph display has a variable, curved, in particular round, extent.
  • The human-machine interface is equipped with a touch-sensitive screen that can be operated independently of the input units described.
  • The input units described are preferably arranged at the edge of the screen.
  • A touch-sensitive surface of the screen covers not only its display unit but also the other input units.
  • As a so-called multi-touch display, the touch-sensitive screen makes it possible to process a large number of inputs in the form of touches at different points on the screen at the same time.
  • Further functions of the human-machine interface can be implemented in this way, in particular the graphical illustration of a process, such as a process to be taught by manual control, the display of a keyboard for entering data and/or the like.
  • The independently operable input units in combination with the touch-sensitive screen enable safe, intuitive operation without having to forego the flexibility that the touch-sensitive screen offers.
  • The control input unit and parameter input unit can be connected directly to a machine control of the machine by means of an, in particular serial, real-time interface.
  • This allows the machine to be controlled manually without any, or at least without any significant, time delay.
  • Any programming processes and/or computationally intensive learning processes can, if necessary, be carried out using the touch-sensitive screen and/or via a corresponding computing unit, for example in the machine control and/or at the human-machine interface.
  • Regardless of this, manual control can be carried out directly and without any time delay via the real-time interface. This enables parameterization in real time.
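A setpoint carried over such a serial real-time link must be serialized into a compact frame. The following sketch shows one way this could look; the frame layout (one unit-identifier byte, one function-code byte, a 32-bit float value) and all names are illustrative assumptions, not the protocol of this patent.

```python
import struct

# Hypothetical frame layout for carrying a manual setpoint from the
# control/parameter input units to the machine control over a serial
# real-time link: unit id, function code, value (little-endian, 6 bytes).
FRAME = struct.Struct("<BBf")

def encode_setpoint(unit_id, function_code, value):
    """Pack one setpoint into a fixed-size binary frame."""
    return FRAME.pack(unit_id, function_code, value)

def decode_setpoint(frame):
    """Unpack a frame back into (unit_id, function_code, value)."""
    return FRAME.unpack(frame)

# Example: the parameter input unit (id 0x01, assumed) reports an
# actuating-speed setpoint (function code 0x10, assumed) of 125.0.
raw = encode_setpoint(0x01, 0x10, 125.0)
unit, func, value = decode_setpoint(raw)
```

A fixed-size frame like this keeps per-message overhead constant, which matters when setpoints are streamed continuously during a manual sweep.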
  • The bargraph display can be implemented passively or actively with luminous segments, in particular as pixels, as a band display with a large number of luminous segments, and/or in another way.
  • The bargraph display preferably has a large number of light segments, such as light-emitting diodes, that can be individually controlled in linear succession.
  • The touch-sensitive elongated surface is arranged along the bargraph display. This makes it possible for the bargraph display to follow an operator input during a setting process on the touch-sensitive surface. In particular, a sweep of the touch-sensitive surface using a part of the body can be followed directly by a corresponding lengthening or shortening of a bar on the bargraph display, and thus the entered or selected positioning speed can be simultaneously and visually reported back.
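The sweep-to-bar behavior described above can be sketched as a simple mapping from a detected contact position to a speed setpoint and a bar length. All names, the strip length, the maximum speed and the segment count are illustrative assumptions, not values from the patent.

```python
def strip_to_speed(touch_pos_mm, strip_len_mm=100.0, v_max=250.0, n_segments=30):
    """Map a finger position on an elongated touch strip to an actuating
    speed and the number of lit bargraph segments reporting it back."""
    # Clamp the contact position to the physical extent of the strip.
    pos = max(0.0, min(touch_pos_mm, strip_len_mm))
    fraction = pos / strip_len_mm
    speed = fraction * v_max            # entered speed setpoint
    lit = round(fraction * n_segments)  # bar length follows the finger
    return speed, lit

# Sweeping to the middle of the strip lights half the bar and selects
# half the maximum speed.
speed, lit = strip_to_speed(50.0)
```

Because the same position drives both the setpoint and the bar, the visual feedback is inherently simultaneous with the input, which is the ergonomic point the patent makes.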
  • A modulable light display is arranged at least on one side, or on both sides, of the bargraph display, by means of which a status of the machine can be displayed.
  • "Modulable" can be understood as meaning that the display can be changed in terms of a color, a flashing pattern, a flashing frequency, a brightness and/or any other characteristic for indicating the state of the machine.
  • At least the colors green for OK and red for critical can be used, if necessary with any number of intermediate tones to symbolize a state between OK and critical.
  • The parameter display unit is also modulable, whereby the large number of light segments of the bargraph display can be modulated in the same way as the light display.
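One way to produce the intermediate tones between green (OK) and red (critical) is a linear color blend over a normalized severity value. The linear blend and the 0.0-1.0 severity scale are assumptions for illustration; the patent only requires that intermediate tones exist.

```python
def state_to_rgb(severity):
    """Blend the light display from green (OK, severity 0.0) to red
    (critical, severity 1.0); values in between yield intermediate tones."""
    s = max(0.0, min(1.0, severity))      # clamp to the valid state range
    red = round(255 * s)
    green = round(255 * (1.0 - s))
    return (red, green, 0)
```

The same function could drive the bargraph segments, since the patent allows the parameter display to be modulated in the same way as the light display.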
  • The bargraph display and/or the light display are conical, with the light display preferably extending in a Y-shape over the human-machine interface.
  • The Y-shaped extension of the illuminated display can be understood to mean that it extends from a corner of the human-machine interface on both sides of the bargraph display and then runs at an angle parallel to two sides of the human-machine interface, starting from the corner. This means that the illuminated display is visible whenever a display area of the human-machine interface is viewed.
  • The light display cannot be completely covered by body parts, such as a hand or an arm of an operator, during any operating actions, and is therefore always visible. This can improve the detection of the machine state and increase safety during operation.
  • A further alternative of the human-machine interface provides for it to have an independently operable selection input unit.
  • The control function can be selected by a manual operation. Due to the independent operability of the selection input unit, the operator of the machine is always aware of which control function he has selected from a large number of control functions. It is particularly preferred that the selection of the control function is also reported back, in particular directly on the selection input unit and/or additionally on the control input unit.
  • The parameter display unit and/or the parameter input unit are preferably arranged between an input unit, in particular for permanently assigned keys, and the control input unit. This central arrangement makes it particularly easy to enter the control parameter and monitor its feedback.
  • The input unit and/or the parameter input unit can have flush-mounted glass cylinders as haptically perceptible operating elements.
  • "Glass" can be understood as any type of transparent material, including acrylic glass. Due to the flush arrangement, the surface of the human-machine interface can be cleaned well, while tactile feedback is still possible, in particular through the tactility of the glass cylinders.
  • The operating elements can have a customary force-travel characteristic of typical keys or buttons.
  • The embedded glass cylinders can preferably be part of a multi-touch input device, similar to a protective glass on a touch-sensitive screen, i.e. be designed as sensor keys, and still enable the desirable haptic or tactile feedback.
  • Operating elements of the control input unit can also be dynamically assigned, in particular marked as required after a corresponding selection on the selection input unit or possibly also on the input unit. They can, for example, be shown on an additional display as a corresponding symbol. This allows the best possible compromise to be found between static and dynamic use of the controls. Operation is carried out consciously via statically arranged and statically assigned control elements. The subsequent actual control of the actuating function can be carried out by means of control elements which are also statically arranged but can be assigned dynamically with the respective actuating function. Furthermore, it is also possible to arrange the parameter setting unit statically on the human-machine interface, i.e. always in the same location. This means that the other critical variable, namely the control parameter such as the control speed, can always be consciously and safely controlled manually.
  • The input unit can also be connected directly to the machine control of the machine by means of the, in particular serial, real-time interface. This allows the machine to be controlled manually without any, or at least without any significant, time delay. Any programming processes and/or computationally intensive learning processes can optionally be carried out using the touch-sensitive screen and/or via a corresponding computing unit, for example in the machine control and/or at the human-machine interface. Regardless of this, manual control can be carried out directly and without any time delay via the real-time interface. This enables parameterization in real time.
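The dynamic assignment of statically arranged control elements can be sketched as a small state machine: a selection input assigns a function to the +/- element pair, pressing the selection again blocks it. The class and method names are illustrative; the patent describes the behavior, not an API.

```python
class ControlInputUnit:
    """Statically arranged +/- control elements whose function assignment
    changes with the selection input (illustrative sketch)."""

    def __init__(self):
        self.assigned_function = None   # blocked until a selection is made

    def select(self, function_name):
        # A selection input dynamically assigns the static element pair.
        self.assigned_function = function_name

    def deselect(self):
        # Pressing the selection input again resets (blocks) the assignment.
        self.assigned_function = None

    def press(self, direction):
        # direction: +1 or -1; presses are ignored while blocked.
        if self.assigned_function is None:
            return None
        return (self.assigned_function, direction)

unit = ControlInputUnit()
unit.select("ejector")        # "ejector" is a hypothetical function name
event = unit.press(+1)        # ('ejector', 1)
unit.deselect()
blocked = unit.press(+1)      # None: element is blocked again
```

Keeping the elements physically static while reassigning their function is exactly the compromise between static and dynamic use that the text describes.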
  • The method for interactively controlling a control parameter of a machine is carried out in particular using a previously described human-machine interface.
  • A touch-sensitive screen and the control input unit are provided.
  • The latter has, in particular, the haptically perceptible control elements, in particular with fixed pictograms and/or symbols, for example in the form of arrows and/or plus and minus signs.
  • The operating elements can be provided in pairs.
  • The control input unit is used to record a manual input for manually controlling the control function of the machine.
  • The parameter input unit is provided, by means of which a further manual parameter input is recorded simultaneously for manually controlling a parameter that is functionally dependent on the control function.
  • A parameter display unit is provided on the human-machine interface.
  • The control input unit and the parameter input unit are connected to a machine control unit of the machine using a real-time interface.
  • The control function and the control parameter are controlled directly, i.e. simultaneously, and a setpoint value, or an actual value of the control parameter of the control function that actually occurs on the machine, is reported back by means of a bargraph display aligned along an elongated touch-sensitive surface of the parameter display unit provided.
  • The set control parameter of the control function, or one that actually occurs on the machine, can be reported back.
  • The simultaneous or immediate feedback, for example of an adjustment speed or another parameter, enables intuitive operation and increases operating safety, for example to avoid an adjustment parameter that is accidentally selected too high.
  • The touch-sensitive screen is operated independently of the parameter input unit and the control input unit. It is possible for the selected control function to be visualized using the screen, in particular in spatial reference or close range to the physical control element of the control input unit. This makes it possible to assign the control element dynamically, i.e. depending on the selection input.
  • The operator of the human-machine interface or the machine can check at any time which control function is currently being executed. Accordingly, the manual control input is recorded using the control element on which the control function is visualized.
  • The visualization can, for example, take place directly next to the corresponding control element on the screen.
  • The control input unit and the parameter input unit are connected to a machine control of the machine by means of a real-time interface in such a way that immediate feedback between the control and the input unit with regard to the setpoint and actual values is possible. This makes it easier to operate the control, and thus the machine, intuitively, not only when parameterizing but also when operating the machine.
  • The input unit is also connected to the machine control of the machine by means of a real-time interface in such a way that immediate feedback between the control and input unit with regard to the setpoint and actual values is possible.
  • The method involves providing a selection input unit, by means of which a manual selection input for selecting the control function is recorded. Only after the manual selection input is the control input unit, in particular one control element or a paired control element of the control input unit, activated for detecting the manual control input.
  • The control input is therefore recorded depending on the selection input, namely only if the control function was previously selected from a large number of control functions.
  • The input unit, or individual operating elements of the input unit, can be blocked, so that blocked, unavailable control functions cannot be selected and consequently cannot be set.
  • The actuating functions can be provided with a limit-switch function, so that overrunning an end stop can be reliably prevented despite a corresponding input on the actuating input unit. This can prevent unwanted damage to the machine.
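The limit-switch behavior amounts to clamping a manually commanded movement to the end stops and flagging the event so a warning can be shown. The function name and the end-stop values below are illustrative assumptions.

```python
def apply_control_input(position, delta, end_stop_min=0.0, end_stop_max=500.0):
    """Apply a manual control input to a movable component while a software
    limit-switch function prevents overrunning the end stops."""
    target = position + delta
    # Clamp so the input cannot drive the component past an end stop.
    clamped = max(end_stop_min, min(target, end_stop_max))
    hit_stop = clamped != target   # a warning message could be raised here
    return clamped, hit_stop

# A command that would overrun the end stop is limited and flagged.
pos, warn = apply_control_input(490.0, 25.0)
```

The returned flag corresponds to the point at which the human-machine interface would display the warning message mentioned next.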
  • A corresponding warning message can be displayed on the human-machine interface.
  • The activation and/or the selected control function is preferably reported back on the control input unit and/or on the input unit.
  • The activation can be confirmed, for example, by backlighting the correspondingly operated control element of the input unit with a greater light intensity. Alternatively or additionally, this can also be done on the control input unit, in particular on the then-available or dynamically assigned control element of the control input unit. It is also possible for the control function to be symbolized on or next to the control element of the control input unit, for example by means of a corresponding function pictogram or symbol.
  • The physical control element of the control input unit is dynamically assigned depending on the manual selection input, and the manual control input is detected by means of the physical control element of the control input unit.
  • The selected control function is preferably symbolized on or next to the dynamically assigned control element of the control input unit, in particular only if a corresponding selection input was previously made on the selection input unit. In particular, by pressing the selection input unit again, the dynamic assignment of the control element of the control input unit can be reset, i.e. blocked again.
  • the object is also achieved by a machine, in particular an injection molding machine, having a previously described human-machine interface and/or set up, designed and/or programmed to carry out a previously described method.
  • the human-machine interface 1 is used to interactively control a machine that is not shown in detail and is only symbolized by reference number 3.
  • the machine 3 is in particular an injection molding machine for processing plastics and other plasticizable materials such as powdery or ceramic masses.
  • the injection molding machine has in particular at least one plasticizing unit and a mold closing unit and preferably a large number of movable and adjustable components, the position of which can be manually controlled or adjusted by a corresponding operator input on the human-machine interface 1.
  • the human-machine interface 1 has a control input unit 5 arranged on a right edge 31 and an input unit 17 arranged on a lower edge 33.
  • a selection input unit 42 is formed on the screen itself using assignable keys.
  • Control input unit 5 and input unit 17 are arranged around a corner 35 on the edges 31 and 33 of the human-machine interface 1. They adjoin one another at the corner 35, similar to a miter joint, with a free space remaining between them.
  • a parameter input unit 7 is arranged in the free space remaining between the control input unit 5 and the input unit 17.
  • the parameter input unit 7 has a surface 11 that fills the free space remaining between the input unit 17 and the control input unit 5.
  • the surface 11 is designed as a touch-sensitive surface, i.e. it reacts to touches by an operator of the human-machine interface 1.
  • the surface 11 is therefore part of the parameter input unit 7; sweeping over the strip-shaped surface 11 enables intuitive setting of an actuating speed of an actuating function of the machine 3, similar to operating a slider.
  • a parameter is understood to mean the parameters that can be entered to operate an injection molding machine. These are pressure, speed of a movement, temperature, times such as switching times between injection and holding pressure phases, times for the operation of the ejector, partial cycle times, e.g. in multi-component injection, path information for the mold clamping unit, mold height, dosing and injection quantity, volume flow, quantity of components to be mixed, and much more with which a specialist in injection molding machines is familiar.
  • the preferred area of application of the invention is therefore in the area in which interdependent parameters have to be entered.
  • the human-machine interface 1 also has a parameter display unit 9, arranged in the free space with an elongated extension, i.e. in a strip shape.
  • the parameter input unit 7 and the parameter display unit 9 are arranged together, preferably one above the other, between the control input unit 5 and the input unit 17. For example, the speed of a movement of the components of an injection molding machine can be entered and displayed via them.
  • the parameter display unit 9 has a bargraph display 13, preferably with a large number of light segments, which are shown by way of example in Fig. 2. Depending on how many segments of the bargraph display 13 light up, a higher or lower actuating speed of the actuating function can be symbolized.
  • the bargraph display 13 follows a sweep of the surface 11 of the parameter input unit 7.
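The way the bargraph follows a swipe over the strip-shaped surface can be sketched as a simple mapping from finger position to lit segments. This is an illustrative sketch only; the segment count and strip length are assumptions, not values from the patent.

```python
# Illustrative sketch (hypothetical values): mapping a finger position on
# the strip-shaped surface 11 to lit segments of the bargraph display 13.

SEGMENTS = 20  # assumed number of light segments

def lit_segments(touch_pos: float, strip_length: float = 100.0) -> int:
    """Return how many segments light up for a touch at touch_pos (in mm).

    Sweeping towards the far end of the strip raises the actuating speed,
    like moving a slider; the bargraph follows the finger position.
    """
    touch_pos = min(max(touch_pos, 0.0), strip_length)  # clamp to the strip
    return round(touch_pos / strip_length * SEGMENTS)

assert lit_segments(0.0) == 0      # finger at the near end: minimum
assert lit_segments(50.0) == 10    # mid-strip: half the segments lit
assert lit_segments(120.0) == 20   # beyond the end: clamped to maximum
```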
  • the control parameter actually present on the machine 3 is reported back by means of the bargraph display 13.
  • the feedback of the control parameter advantageously takes place simultaneously with the touching of the surface 11 and/or simultaneously with the control parameter actually present, for example the actual actuating movement at an actuating speed as the control parameter.
  • a light display 15 is arranged on both sides of the parameter display unit 9, in particular on both sides of the bargraph display 13.
  • the light display 15 extends from the corner 35 along the longitudinal extent of the free space or the parameter display unit 9 arranged therein and spreads out in a Y-shape at an inner end of the parameter display unit 9. In particular, it extends in a Y-shape parallel along the input unit 17 and the control input unit 5.
  • feedback on the state of the machine 3 can be provided simultaneously with the interactive control of the machine 3. This can be done by modulating the light display, in particular by changing a color of the light display 15. This means that an operator of the human-machine interface 1 can always be informed about the status of the machine 3, in particular with regard to the currently manually controlled actuating function.
  • the human-machine interface 1 also has a touch-sensitive screen 23, visible in Figs. 1a and 2.
  • the control input unit 5 and the input unit 17 are arranged adjacent to the screen 23 or along the sides.
  • the screen 23 can be designed as a so-called touchscreen, in particular multi-touch capable, i.e. resolving several touch points at the same time.
  • a touch-sensitive surface 37 of the screen 23 extends over the control input unit 5 and the input unit 17.
  • the surface 11 can be provided for operating the parameter input unit 7.
  • the surface 11 can also have a separate device for detecting touches, and the control input unit 5 and/or the input unit 17 can be conventional or separate keys.
  • Both the input unit 17 and the control input unit 5 have a large number of physically fixed control elements 21. These preferably comprise transparent cylinders, in particular glass cylinders 19.
  • the glass cylinders can be made of any transparent material such as glass, acrylic glass, another transparent plastic and/or the like.
  • the surface 37 and the glass cylinders 19 are preferably made of identical materials.
  • the operating elements 21 of the input unit 17 have different pictograms 39 to symbolize a selectable setting function of the machine 3 and are preferably permanently assigned to these functions. For example, this is a pictogram to symbolize a screw movement of a plasticizing unit of the machine 3. Pressing or actuating the corresponding control element 21 results in two control elements 21 of the control input unit 5 arranged next to one another being activated for the corresponding control function, i.e. adjustment of the screw.
  • a symbol 41 appears on the touch-sensitive screen next to the control elements 21 of the control input unit 5, such as a symbol divided into two, which illustrates a functional assignment of the control elements 21, for example directions of the screw movement.
  • the description of the screw movement is an example and can be used analogously for all possible control functions of the machine 3, in the exemplary embodiment an injection molding machine.
  • control elements 21 of the control input unit 5 are dynamically assigned as described above, i.e. optionally marked with one of the symbols 41, or released again for assignment with a further control function after renewed actuation, after a time has elapsed and/or according to another condition.
  • a selection input 29 is first made on one of the control elements 21 of the selection input unit 42.
  • An input can also be made via one of the control elements 21 of the input unit 17 that is permanently assigned a function.
  • two control elements 21 of the control input unit 5 are activated, i.e. assigned dynamically and marked with the symbol 41 for the selected control function on the touch-sensitive screen 23.
  • the actuating function can be controlled manually by a control input 25 on the two control elements 21 of the control input unit 5, in particular bidirectionally, as preferably symbolized by arrow symbols on the control elements 21.
  • the control speed of the correspondingly selected control function, for example the screw movement, can be set simultaneously by sweeping over the surface 11 on the parameter input unit 7.
  • the description of the speed of the screw movement as a parameter is exemplary and can be used analogously for all possible control functions of parameters on the machine 3.
  • the actuating speed entered or selected in this way is also displayed simultaneously by means of the parameter display unit 9.
  • the control speed is entered by means of a parameter input 27, in particular a bidirectional sweep over the surface 11.
  • the inputs 25 to 29 are symbolized by arrows in Fig. 2.
  • the human-machine interface 1 has a fastening console 43.
  • the fastening console 43 has a fastening surface 45 arranged at an acute angle to the surface 37 of the screen 23. This makes it possible to attach the human-machine interface 1 to a vertical wall of the machine 3 or to a stand on the machine in such a way that the screen 23 is inclined relative to the vertical. As a result, the ergonomics, in particular reading and/or input on the human-machine interface 1, can be improved. In particular, viewing and reading the human-machine interface 1 from diagonally above can be made easier.
  • the fastening surface 45 has holes 47 for attachment and an opening 49 for contact with the machine 3.
  • control input unit 5 and/or input unit 17 are at least slightly inclined towards the surface 37 of the screen 23, which can be seen particularly well in the detail of Fig. 4b.
  • Surface 37 and the input units 17 and/or 5 are arranged at an obtuse angle 51 to one another. Seen in a viewing direction 53, which is symbolized by an arrow in Figs. 4a and 4b, the input units 5 and/or 17 are inclined away from the operator of the human-machine interface 1.
  • the control input unit 5 and/or input unit 17 are inclined away from the operator, in particular at an obtuse angle, for example between 180 and 150 degrees, preferably by 179 to 175 degrees.
  • this inclination enables a comfortable hand position when operating the laterally arranged input units.
  • possible incorrect operation of the touch-sensitive screen, for example by an operator's arm, can be avoided, since the input units are tilted away from the operator's arm and are therefore not accidentally touched.
  • the human-machine interface 1 provides the control input unit 5, the parameter input unit 7 and the parameter display unit 9.
  • the manual control input 25 is recorded for manually controlling the control function of the machine 3.
  • the parameter input 27 is recorded simultaneously, with the control parameters, such as an control speed, also being visually perceptible to the operator simultaneously by means of the parameter display unit 9 the human-machine interface 1 is reported back.
  • the manual selection input 29 for selecting the control function is preferably recorded beforehand on the selection input unit 42 provided.
  • the control input unit 5 is activated in response to or depending on the selection input 29. Only then can the manual control input 25 be recorded.
  • a control element 21 of the input unit 17 or of the selection input unit 42 can be backlit with maximum brightness after actuation.
  • Selectable controls 21 or control functions that can be controlled with them can be backlit with a medium brightness. Control functions that cannot be selected, i.e. that are not currently available on the machine 3, cannot be backlit or can only be backlit with weaker backlighting, for example on the input unit 17.
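The multi-level backlighting described above can be summarized as a small mapping from the state of a control element to a brightness. This is an illustrative sketch; the percentage values are assumptions and not taken from the patent.

```python
# Illustrative sketch (brightness values are assumptions): the multi-level
# backlighting of the control elements 21 described above.

def backlight_level(selectable: bool, selected: bool) -> int:
    """Return a backlight intensity in percent for one control element 21.

    - selected / actuated element: maximum brightness
    - selectable but not selected: medium brightness
    - not selectable (function unavailable on the machine): off
    """
    if selected:
        return 100  # maximum brightness confirms the selection
    if selectable:
        return 50   # medium brightness marks available functions
    return 0        # unavailable functions stay dark

assert backlight_level(selectable=True, selected=True) == 100
assert backlight_level(selectable=True, selected=False) == 50
assert backlight_level(selectable=False, selected=False) == 0
```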
  • the activation and/or the selected control function is also reported back to the control input unit 5. This can likewise be done by means of backlighting, in particular by increasing the brightness of the corresponding control elements 21, and/or by means of the symbol 41 shown on the screen 23.
  • the selected control function can be visualized using the symbol 41. Accordingly, the manual control input can be recorded using the corresponding operating elements 21 of the control input unit 5.
  • the control elements 21 of the control input unit 5 are therefore dynamically assigned depending on the selection input 29 on the control input unit 5.
  • the light display 15 and/or the parameter display unit 9 have a Y-shape that tapers towards the corner 35, which supports or enables intuitive operation or setting of the control parameters.
  • the parameter input unit 7 forms a slider for setting parameters such as pressure, temperature, times such as switching times between injection and holding pressure phases, times for the operation of the ejector, partial cycle times, for example in multi-component injection, path information for the mold clamping unit, mold height, dosing and injection quantity, volume flow, quantity of components to be mixed and much more, in particular for setting and/or reducing axis speeds of the actuating function or of the machine 3.
  • a parameter value to be set can be adjusted almost continuously, or in many individual steps, by simply pushing or sliding a finger over the surface 11, which is preferably designed as a glass surface.
  • the currently set parameter value, for example the currently set actuating speed, is visualized by the bargraph display 13, which in particular has a large number of individual LEDs, in particular in a distinctive arrow shape, to support intuitive detection of the control parameter.
  • the current state of the machine 3 can always be displayed, in particular using appropriate colors.
  • the operating elements 21 are advantageously designed to be tactilely discoverable, in particular by means of glass cylinders 19.
  • the ejector forward and ejector back actuating functions can be executed as soon as they are activated after a corresponding selection input 29.
  • the corresponding trigger or button is the glass cylinder 19, which is preferably inserted tightly and flush into a pane of the human-machine interface 1, with both a key stroke and a force travel characteristic of a typical key or a button being able to be reproduced when actuated.
  • the flush, tight installation and the choice of glass material for the key cylinder, i.e. the glass cylinder 19, make it possible to easily clean the control input unit 5, the input unit 17 and/or the entire human-machine interface 1.
  • the operating elements 21, in particular their glass cylinders 19, preferably contain one of the pictograms 39 and are preferably backlit in several brightness levels, for example to symbolize a selection state and/or a release for operation.
  • control elements 21 that can be operated for user guidance can be illuminated, while those that cannot currently be operated are darkened.
  • the control input unit 5, which is preferably also designed as a keyboard bar, has freely assignable control elements 21 or keys.
  • the operating elements 21 in particular have arrow symbols that can symbolize a direction of the control parameters that can be controlled manually.
  • the symbols 41 can be displayed on the screen 23 to indicate the respective function of the control elements 21 arranged to the right, as seen in the orientation of Fig. 2.
  • the touch-sensitive screen 23 is designed in particular as a high-resolution multi-touch screen. If necessary, the respective symbol 41 is divided into two parts in order to additionally symbolize the directions of the actuating function.
  • control input unit 5 and/or the input unit 17 are connected directly to a machine control of the machine 3 by means of a real-time interface (not shown in detail in the figures).
  • the control input unit and/or input unit therefore does not act via the HMI PC, but is connected directly to the machine control via a clock-synchronous serial interface. This enables immediate feedback between setpoint and actual value via the parameter display unit 9, as well as parameterization in real time.
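The clock-synchronous exchange can be pictured as one function call per interface tick: the current setpoint goes out, the actual value reported by the machine control comes back for display. This is a deliberately minimal sketch with hypothetical names; it models only the data flow, not any real fieldbus protocol.

```python
# Illustrative sketch (all names hypothetical): one tick of a
# clock-synchronous serial exchange between HMI and machine control.

def exchange_cycle(setpoint: float, machine_actual: float) -> dict:
    """One tick of the real-time interface.

    Each tick, the setpoint from the parameter input unit 7 is sent and
    the actual value from the machine control is returned, so the
    parameter display unit 9 can show the value actually present.
    """
    return {"sent_setpoint": setpoint, "display_actual": machine_actual}

# Simulate a ramp: the displayed value follows the machine, not the input.
actuals = [0.0, 2.5, 5.0]
frames = [exchange_cycle(5.0, a) for a in actuals]
assert [f["display_actual"] for f in frames] == [0.0, 2.5, 5.0]
assert all(f["sent_setpoint"] == 5.0 for f in frames)
```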
  • the complete human-machine interface 1 or the surface 37 of the screen 23 can be attached at an angle to the machine 3 or on a stand (not shown in the drawing), in particular at an angle between 10 and 20 degrees, in particular 15 degrees to the vertical.
  • This allows the ergonomics of computer workstations to be optimized.
  • the human-machine interface 1 can be adapted to operators of different heights via a height adjustment.
  • the control input unit 5 and/or the input unit 17 are slightly tilted backwards and, together with the opposite inclination of a housing of the human-machine interface 1, facilitate the operation of the individual control elements 21 or buttons and the surface 11 of the parameter input unit 7.
  • a travel function of an axis can be carried out directly by activating a permanently assigned control element 21 in the input field 17, for example the "screw forward" button.
  • Some of these buttons are direct travel buttons, which means that when you press them, an immediate (manually operated) movement occurs. However, in the automatic cycle, pressing this button is blocked because pressing it is not plausible, which is visualized by a lack of backlighting.
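The plausibility blocking of direct travel keys in the automatic cycle can be sketched as follows. This is an illustrative sketch with a hypothetical API; it shows only the described logic that a press is ignored in the automatic cycle and that the block is visualized by missing backlighting.

```python
# Illustrative sketch (hypothetical names): blocking a direct travel key
# in the automatic cycle, visualized by switched-off backlighting.

def handle_travel_key(mode: str, pressed: bool) -> dict:
    """Evaluate a direct travel key such as 'screw forward'.

    In 'manual' mode a press triggers an immediate movement; in the
    'automatic' cycle the press is not plausible and is therefore
    blocked, which the missing backlight signals to the operator.
    """
    enabled = (mode == "manual")
    return {
        "backlit": enabled,           # no backlight visualizes the block
        "move": enabled and pressed,  # movement only when plausible
    }

assert handle_travel_key("manual", pressed=True) == {"backlit": True, "move": True}
assert handle_travel_key("automatic", pressed=True) == {"backlit": False, "move": False}
```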
  • the controls 21 are enabled, backlit and are backlit even brighter when actuated.
  • the input field 17 contains the essential keys of the injection molding machine, so that operation is possible without screen navigation.


Claims (13)

  1. Human-machine interface (1) for the simultaneous and interactive control of an actuating function of a machine (3), comprising:
    - a touch-sensitive screen (23);
    - a control input unit (5) operable independently of the touch-sensitive screen (23) and designed to manually control an actuating function of the machine (3) as a function of the control input (25), and
    - a parameter input unit (7) operable independently both of the control input unit (5) and of the touch-sensitive screen (23),
    - the control input unit (5) and the parameter input unit (7) being connected by means of a real-time interface to a machine control of the machine,
    - the independently operable parameter input unit (7) being designed to manually control, as a function of a parameter input (7), simultaneously a control parameter of the actuating function which is functionally dependent on the actuating function, and
    - the human-machine interface (1) comprising a parameter display unit (9) arranged together with the parameter input unit (7) and designed to report back a setpoint value or an actual value of the manually controlled control parameter actually present at the machine (3), the parameter display unit (9) having a bargraph display (13) arranged along an elongated touch-sensitive surface (11) of the parameter input unit (7),
    characterized in that a modulatable light display (15), by means of which a state of the machine (3) can be indicated, is provided laterally or on both sides of the bargraph display (13).
  2. Human-machine interface according to claim 1, characterized in that the bargraph display (13) and/or the light display (15) is/are tapered and/or at least the light display (15) extends in a Y-shape over the human-machine interface (1).
  3. Human-machine interface according to claim 2, characterized by an independently operable selection input unit (42), by means of which the actuating function can be selected.
  4. Human-machine interface according to one of the preceding claims, characterized in that an input unit (17), preferably having physical keys, is provided, and in that the parameter display unit (9) and/or the parameter input unit (7) is/are arranged between the input unit (17) and the control input unit (5).
  5. Human-machine interface according to claim 4, characterized in that the input unit (17) and/or the control input unit (5) has/have recessed, flush-mounted glass cylinders (19) as touch-sensitive actuating elements (21).
  6. Human-machine interface according to one of claims 4 and 5, wherein the input unit (17) is connected by means of the real-time interface to the machine control of the machine (3).
  7. Method for the interactive control of a control parameter of a machine (3) by means of a human-machine interface (1), comprising:
    - providing a touch-sensitive screen (23),
    - providing a control input unit (5) of the human-machine interface (1) operable independently of the touch-sensitive screen (23),
    - making a manual control input (25) by means of the provided control input unit (5) for the manual control of an actuating function of the machine (3),
    - providing a parameter input unit (7) of the human-machine interface (1) operable independently of the touch-sensitive screen (23),
    - simultaneously making a manual parameter input (27) by means of the provided parameter input unit (7) for the manual control of a parameter of the actuating function which is functionally dependent on the actuating function,
    - providing a parameter display unit (9) of the human-machine interface (1),
    - connecting the control input unit (5) and the parameter input unit (7) by means of a real-time interface to a machine control of the machine (3),
    - simultaneously reporting back the control parameter of the actuating function by means of the provided parameter display unit (9),
    - simultaneously controlling the actuating function and the control parameter and thereby reporting back a setpoint value or an actual value, actually present at the machine (3), of the control parameter of the actuating function by means of a bargraph display (13) arranged along an elongated touch-sensitive surface (11) of the provided parameter display unit (9),
    characterized in that a light display, by means of which a state of the machine (3) can be indicated, can be modulated at least laterally on both sides of the bargraph display (13).
  8. Method according to claim 7, characterized by:
    connecting an input unit (17) by means of the real-time interface to the machine control of the machine (3).
  9. Method according to claim 7 or 8, characterized by:
    - providing a selection input unit (42) of the human-machine interface (1),
    - making a manual selection input (29) by means of the selection input unit (42) for selecting the actuating function,
    - enabling the control input unit (5) for making the manual control input (25) as a function of the selection input (29) made.
  10. Method according to claims 8 and 9, characterized by:
    - reporting back and/or symbolizing the enabling and/or the selected actuating function at the input unit (17) and/or the control input unit (5).
  11. Method according to claim 9 or 10, characterized by:
    - dynamically assigning a physical actuating element (21) of the control input unit (5) as a function of the manual selection input (29),
    - making the manual control input (25) by means of the physical actuating element (21) of the control input unit (5).
  12. Method according to claim 11, characterized by:
    - symbolizing the selected actuating function on, at or next to the dynamically assigned actuating element (21) of the control input unit (5).
  13. Injection molding machine comprising a human-machine interface (1) according to one of claims 1 to 6 and/or set up, designed, constructed and/or programmed for carrying out a method according to one of claims 7 to 12.
EP17803792.5A 2016-10-18 2017-10-18 Commande interactive d'une machine dotée d'un retour d'information sur un paramètre de réglage Active EP3542235B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102016119853.6A DE102016119853A1 (de) 2016-10-18 2016-10-18 Interaktives Steuern einer Maschine mit Rückmeldung eines Stellparameters
PCT/EP2017/076589 WO2018073294A1 (fr) 2016-10-18 2017-10-18 Commande interactive d'une machine dotée d'un retour d'information sur un paramètre de réglage

Publications (3)

Publication Number Publication Date
EP3542235A1 EP3542235A1 (fr) 2019-09-25
EP3542235C0 EP3542235C0 (fr) 2023-11-29
EP3542235B1 true EP3542235B1 (fr) 2023-11-29

Family

ID=60450582

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17803792.5A Active EP3542235B1 (fr) 2016-10-18 2017-10-18 Commande interactive d'une machine dotée d'un retour d'information sur un paramètre de réglage

Country Status (5)

Country Link
US (1) US11150799B2 (fr)
EP (1) EP3542235B1 (fr)
CA (1) CA3040976A1 (fr)
DE (1) DE102016119853A1 (fr)
WO (1) WO2018073294A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210307940A1 (en) * 2018-12-11 2021-10-07 University Of New Brunswick Systems and methods for inductive tuning of human-machine interface

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1273851A2 (fr) * 2001-07-07 2003-01-08 Therma Grossküchen Produktion AG Dispositif de commande pour appareil de cuisson
US20060016800A1 (en) * 2004-07-20 2006-01-26 Massachusetts Institute Of Technology Continuous capacitive slider controller for a smooth surfaced cooktop

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4706676A (en) 1985-02-11 1987-11-17 The United States Of America As Represented By The Secretary Of The Army Dermal substance collection device
US4704676A (en) * 1986-03-24 1987-11-03 The Foxboro Company Method and apparatus for configuring a controller
DE9110348U1 (de) 1991-06-14 1991-11-21 Buhl Automatic Inc., Guelph, Ontario Schaltungsanordnung zur Steuerung einer Maschinenanlage
US6684264B1 (en) * 2000-06-16 2004-01-27 Husky Injection Molding Systems, Ltd. Method of simplifying machine operation
US20040201765A1 (en) * 2001-03-19 2004-10-14 Gammenthaler Robert S. In-car digital video recording with MPEG compression
US6966837B1 (en) * 2001-05-10 2005-11-22 Best Robert M Linked portable and video game systems
US20040021698A1 (en) * 2002-08-05 2004-02-05 Baldwin Amanda K. Intuitive touchscreen interface for a multifunction device and method therefor
DE10334153A1 (de) 2003-07-26 2005-02-24 Karl Hehl Verfahren und Vorrichtung zur interaktiven Steuerung einer Maschine
DE102004051106A1 (de) * 2004-10-19 2006-04-27 Demag Ergotech Gmbh Kunststoffverarbeitende Maschine
WO2007025396A1 (fr) 2005-07-18 2007-03-08 Netstal-Maschinen Ag Procede et dispositif de commande permettant de commander une ou plusieurs machines
DE102005052725B3 (de) * 2005-11-04 2007-06-06 Dr. Boy Gmbh & Co. Kg Bedienelement für eine Spritzgießmaschine
DE102010051639A1 (de) * 2010-11-17 2012-05-24 Netstal-Maschinen Ag Steuerungsvorrichtung mit Multi-Touch Funktionalität
AT511488A3 (de) * 2011-05-16 2014-12-15 Keba Ag Verfahren zur manuell gesteuerten beeinflussung von bewegungen einer maschine oder anlage sowie entsprechende maschinensteuerung
AT511487B1 (de) * 2011-06-09 2016-01-15 Engel Austria Gmbh Bedieneinheit für eine spritzgiessmaschine
US10598388B2 (en) * 2016-04-07 2020-03-24 Electrolux Home Products, Inc. Appliance with electrovibration user feedback in a touch panel interface


Also Published As

Publication number Publication date
CA3040976A1 (fr) 2018-04-26
WO2018073294A1 (fr) 2018-04-26
US11150799B2 (en) 2021-10-19
EP3542235C0 (fr) 2023-11-29
DE102016119853A1 (de) 2018-04-19
US20200050354A1 (en) 2020-02-13
EP3542235A1 (fr) 2019-09-25

Similar Documents

Publication Publication Date Title
EP1907906B2 (fr) Procede et dispositif de commande permettant de commander une ou plusieurs machines
DE10340188A1 (de) Bildschirm mit einer berührungsempfindlichen Bedienoberfläche zur Befehlseingabe
WO2008071669A1 (fr) Unité de commande à touches d'écran tactile
AT511488A2 (de) Verfahren zur manuell gesteuerten beeinflussung von bewegungen einer maschine oder anlage sowie entsprechende maschinensteuerung
DE102012016109A1 (de) Bedienvorrichtung zum Einstellen einer Klimatisierungsvorrichtung eines Fahrzeugs und Verfahren hierzu
EP3372435B1 (fr) Procédé et système de commande destinés à fournir une interface utilisateur
EP3234743B1 (fr) Procédé de fonctionnement d'un dispositif de commande d'un véhicule dans des modes de fonctionnement différents, dispositif de commande et véhicule automobile
DE102012219302A1 (de) Anzeigevorrichtung für Fertigungsmaschine
DE102014008040A1 (de) Kraftfahrzeug-Ein-/Ausgabevorrichtung und -verfahren
EP3508967B1 (fr) Procédé de fonctionnement d'une interface homme-machine ainsi qu'interface homme-machine
EP3270278A1 (fr) Procede de fonctionnement d'un systeme de commande et systeme de commande
EP3898310B1 (fr) Procédé et système pour régler la valeur d'un paramètre
EP3542235B1 (fr) Commande interactive d'une machine dotée d'un retour d'information sur un paramètre de réglage
WO2016050390A1 (fr) Pupitre de commande
DE102014000789A1 (de) Werkzeugmaschine mit Displayvorrichtung
DE102018202657A1 (de) Vorrichtung und verfahren zum steuern von fahrzeugfunktionen sowie fahrzeug
EP3755567B1 (fr) Dispositif de commande et procede de controle d'au moins une unite fonctionelle pour un vehicule avec un deplacement visuel d'un symbole de commande
DE102016207611B4 (de) Anzeige- und Bediensystem
DE202012013272U1 (de) Mobile Einrichtung zur computergestützten Ein- und Ausgabe von Daten mit einer integrierten Bildschirmausgabeeinheit
DE102016215005A1 (de) Bedienelement, Infotainment-System und Verfahren zum Steuern eines Infotainment-Systems
EP1177566B1 (fr) Dispositif permettant d'entrer des valeurs et dote d'un ecran
EP3443422B1 (fr) Machine-outil et utilisation d'un écran tactile pour commander un organe d'une machine-outil
EP3268852B1 (fr) Procédé de sélection ciblée d'éléments affichés sur un écran tactile
DE102013015227A1 (de) Bedieneinrichtung für einen Kraftwagen
DE102013007105A1 (de) Bedienvorrichtung

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190517

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210707

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/0489 20130101ALI20230601BHEP

Ipc: G06F 3/0354 20130101ALI20230601BHEP

Ipc: G06F 3/04847 20220101ALI20230601BHEP

Ipc: B29C 45/76 20060101ALI20230601BHEP

Ipc: G05B 19/409 20060101AFI20230601BHEP

INTG Intention to grant announced

Effective date: 20230614

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502017015648

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

U01 Request for unitary effect filed

Effective date: 20231216

U07 Unitary effect registered

Designated state(s): AT BE BG DE DK EE FI FR IT LT LU LV MT NL PT SE SI

Effective date: 20240105

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240301

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240329

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231129

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231129

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231129

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240229

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231129

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231129

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231129

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231129

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231129

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502017015648

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT