
US20190231045A1 - Interactive animate luggage - Google Patents

Interactive animate luggage

Info

Publication number
US20190231045A1
Authority
US
United States
Prior art keywords
suitcase
processor
container
appendage
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/190,526
Inventor
Aili Jian
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US29/578,325 (external priority; USD824676S1)
Application filed by Individual
Priority to US16/190,526
Publication of US20190231045A1
Legal status: Abandoned

Classifications

    • A - HUMAN NECESSITIES
    • A45 - HAND OR TRAVELLING ARTICLES
    • A45C - PURSES; LUGGAGE; HAND CARRIED BAGS
    • A45C5/00 - Rigid or semi-rigid luggage
    • A45C5/03 - Suitcases
    • A45C5/14 - Rigid or semi-rigid luggage with built-in rolling means
    • A45C7/00 - Collapsible or extensible purses, luggage, bags or the like
    • A45C7/0018 - Rigid or semi-rigid luggage
    • A45C9/00 - Purses, luggage or bags convertible into objects for other use
    • A45C2009/005 - Purses, luggage or bags convertible into objects for other use into a vehicle, e.g. scooter
    • A45C13/00 - Details; Accessories
    • A45C13/001 - Accessories
    • A45C2200/00 - Details not otherwise provided for in A45C

Landscapes

  • Toys (AREA)

Abstract

A suitcase that, in certain embodiments, allows for animate and interactive zoolocomotion and zoomimicry of luggage. Importantly, such output may occur without the need for a user to press buttons; instead, such output may be triggered by natural interactions with embodiments described herein, such as when a child strokes a cat, causing the cat to purr in enjoyment.

Description

    PRIORITY
  • This application claims the benefit of co-pending U.S. provisional application 62/587,211, filed Nov. 16, 2017 by the same inventors, which is incorporated by reference as if fully set forth herein. This application is also a continuation-in-part of co-pending application Ser. No. 29/617,217, filed Sep. 9, 2017, which in turn is a continuation of U.S. Pat. No. D824,676, issued Aug. 7, 2018.
  • BACKGROUND Field of Invention
  • Embodiments of the present disclosure relate generally to luggage, and more specifically, to an animal-like, rolling suitcase with sensory input and audiovisual, physical and tactile output.
  • Description of Related Art
  • Currently, luggage is inanimate and lacks interactivity with owners. Thus owners are unlikely to be emotionally attached to their luggage. This is especially true with children. Because of this dearth, luggage is merely borne by tired, disinterested users, dragged and bumped along through hotels and airports all over the world. Clearly there is a need for interactive, animate and entertaining luggage.
  • SUMMARY
  • Embodiments described herein allow for animate and interactive zoolocomotion and zoomimicry of luggage. Importantly, such output may occur without the need for a user to press buttons; instead, such output may be triggered by natural interactions with embodiments described herein, such as when a child strokes a cat causing the cat to purr in enjoyment.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a circuit block diagram, according to one embodiment of the present disclosure.
  • FIG. 2 illustrates an interactive and animate suitcase, according to one embodiment of the present disclosure.
  • FIG. 3 illustrates a method for driving audiovisual, physical and/or tactile output based on sensory input, according to one embodiment of the present disclosure.
  • DETAILED DESCRIPTION Generality of Invention
  • This application should be read in the most general possible form. This includes, without limitation, the following:
  • References to specific techniques include alternative and more general techniques, especially when discussing aspects of embodiments described herein, or how the embodiment might be made or used.
  • References to “preferred” techniques generally mean that the inventor contemplates using those techniques, and thinks those techniques are best for the intended application. This does not exclude other techniques for embodiments described herein, and does not mean that those techniques are necessarily essential or would be preferred in all circumstances.
  • References to contemplated causes and effects for some implementations do not preclude other causes or effects that might occur in other implementations.
  • References to reasons for using particular techniques do not preclude other reasons or techniques, even if completely contrary, where circumstances would indicate that the stated reasons or techniques are not as applicable.
  • Furthermore, embodiments described herein are in no way limited to the specifics of any particular embodiments and examples disclosed herein. Many other variations are possible which remain within the content, scope and spirit of embodiments described herein, and these variations would become clear to those skilled in the art after perusal of this application.
  • More detail may be found in the attached appendix, which is incorporated by reference as if fully set forth herein.
  • Glossary
  • As used herein, “coapproach” (noun) may refer to the process of movement of one or more objects towards the other objects and/or towards a common center before collision. As used herein, “coapproaching” (verb) refers to locomotion during coapproach.
  • As used herein, “zoolocomotion” (noun), “zoolocomote” (verb), and zoolocomotory (adjective) may refer to any animal-like movement of a member or part of a member of the animal kingdom.
  • As used herein, “zoomimicry” (noun), “zoomimical” (adjective), and “zoomimic/zoomimicking” (verb) may refer to non-living objects taking the appearance and/or behavior of a member or part of a member of the animal kingdom.
  • FIG. 1
  • FIG. 1 illustrates a circuit block diagram, according to one embodiment of the present disclosure. Circuit 100 employs bus 102 to electrically connect (1) data processing 105, (2) sensory input 120, (3) audiovisual, physical and tactile output 155, and (4) power system 180. Data processing 105 includes processor 110 and memory 115. Sensory input 120 includes accelerometer 125, tactile sensor 130, microphone 135, camera 140, rotation sensor 145 and handle sensor 150. Audiovisual, physical and tactile output 155 includes vibratory driver 160, speaker 165, display 170 and actuator 175. Power system 180 includes motor/dynamo 185, power/data cable 190 and battery 195. The inventors contemplate the connection of elements of circuit 100 in any and all conceivable fashion.
  • Data Processing 105
  • Data processing 105 includes processor 110 and memory 115. In one embodiment, processor 110 may execute commands related to sensory input data from sensory input 120 (described herein). In another embodiment, processor 110 may execute instructions that may trigger actions by output 155 (described herein).
  • Memory 115 may store, by way of example and not limitation, inputs, commands, outputs or other data. By way of example and not limitation, memory 115 may be short-term memory (e.g., random access memory) or long-term data storage (e.g., EEPROM or solid state memory). In one embodiment, memory 115 may store sensory input data from sensory input 120 (described herein). In another embodiment, memory 115 may store command instructions (e.g., software or firmware) that, when executed by processor 110, may trigger actions by output 155 (described herein). In an additional embodiment, memory 115 may be used as backup/auxiliary data storage for smart devices, laptops and the like.
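  • The following sketch, offered purely for illustration and not as part of the disclosure, shows one way short-term memory might buffer timestamped sensor readings; the class name, sample format and retention window are assumptions chosen for the example.

```python
# Illustrative sketch only: a bounded in-RAM telemetry log standing in for
# memory 115. The class and field names are hypothetical, not from the patent.
import time
from collections import deque

class TelemetryLog:
    """Keep the most recent sensor samples in short-term memory."""

    def __init__(self, max_samples=1024):
        self._samples = deque(maxlen=max_samples)  # oldest samples fall off the end

    def record(self, sensor_name, value):
        """Store one timestamped reading from sensory input 120."""
        self._samples.append((time.time(), sensor_name, value))

    def recent(self, sensor_name, window_s=2.0):
        """Return readings from the last window_s seconds for one sensor."""
        cutoff = time.time() - window_s
        return [(t, v) for (t, name, v) in self._samples
                if name == sensor_name and t >= cutoff]

# Example: log an accelerometer sample, then fetch the last two seconds of data.
log = TelemetryLog()
log.record("accelerometer", (0.0, 0.1, 9.8))
print(log.recent("accelerometer"))
```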
  • Certain embodiments may include wireless communications circuitry (not shown) to effectuate programmability. This may include Bluetooth, near field communications (NFC), and Wi-Fi circuitry. Moreover, certain embodiments may include location sensing, such as GPS, coupled to the processor.
  • Sensory Input 120
  • Sensory input 120 includes accelerometer 125, tactile sensor 130, microphone 135, camera 140, rotational sensor 145, handle sensor 150. Accelerometer 125 may be a 3-axis (i.e., X-, Y- and Z-axis) accelerometer capable of detecting motion of circuit 100 and/or embodiments described herein connected to circuit 100. In this manner, accelerometer 125 may detect when embodiments described herein undergo motion, and may send telemetry to processor 110.
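  • As a hedged illustration of the motion detection described above, the sketch below flags motion when the 3-axis acceleration magnitude departs noticeably from 1 g; the threshold value is an assumption, not taken from the disclosure.

```python
# Illustrative sketch only: deciding, from 3-axis accelerometer samples, whether
# the suitcase is in motion. The threshold and sample source are assumptions.
import math

GRAVITY = 9.81          # m/s^2
MOTION_THRESHOLD = 0.6  # deviation from 1 g treated as "moving" (assumed value)

def is_in_motion(sample_xyz):
    """Return True when the acceleration magnitude departs noticeably from 1 g."""
    x, y, z = sample_xyz
    magnitude = math.sqrt(x * x + y * y + z * z)
    return abs(magnitude - GRAVITY) > MOTION_THRESHOLD

# Example: a stationary reading vs. a jostled one.
print(is_in_motion((0.0, 0.0, 9.81)))   # False: at rest
print(is_in_motion((1.2, 0.4, 10.9)))   # True: being rolled or shaken
```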
  • Tactile sensor 130 may send touch data to processor 110, which, in turn, causes processor 110 to execute commands described herein. In some embodiments, tactile sensor 130 may be located on a suitcase handle (not pictured) or on a suitcase shell (not pictured) and may detect when a user touches tactile sensor 130.
  • Microphone 135 may record audio and transmit audio data to processor 110, causing processor 110 to execute commands described herein. In one embodiment, processor 110 may cause microphone 135 to monitor audio for ‘key phrases’ spoken by a user. In this example, when a ‘key phrase’ is spoken and detected by microphone 135, microphone 135 may trigger processor 110 to execute output commands described herein.
  • Camera 140 may record video and transmit video data to processor 110, causing processor 110 to execute commands described herein. In one embodiment, processor 110 may cause camera 140 to monitor video for ‘key gestures’ performed by a user. In this example, when a ‘key gesture’ is performed and detected by camera 140, camera 140 may trigger processor 110 to execute output commands described herein.
  • Rotation sensor 145 may detect rotation of embodiments described herein and send rotational data (i.e., telemetry) to processor 110. In one embodiment, a commercially available rotation sensor 145 may be mechanically linked to a wheel (not pictured) and may detect when a user causes the wheel to roll. In a further embodiment, telemetry may be used to trigger processor 110 to execute output commands described herein.
  • Handle sensor 150 may detect telescopic movement of coaxial shafts. Such coaxial shafts may be, by way of example and not limitation, a telescoping, extendable handle shaft such as that found on a suitcase. In one embodiment, handle sensor 150 may be located proximate to a handle shaft (not pictured) and may detect when a user telescopically extends a handle shaft. This detection may be effectuated using a commercially available proximity sensor or limit switch. In a further embodiment, movement detected by handle sensor 150 may cause processor 110 to execute output commands described herein.
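  • A minimal sketch of such handle detection follows, assuming a simulated limit-switch read in place of real hardware; the polling interval and debounce count are illustrative choices, not values from the disclosure.

```python
# Illustrative sketch only: debounced polling of a handle limit switch.
# The switch read below is simulated; a real device would query a proximity
# sensor or limit switch instead.
import itertools
import time

# Simulated switch states standing in for hardware reads: the handle starts
# retracted (False) and is then extended (True).
_SIMULATED_READS = itertools.chain([False] * 10, itertools.repeat(True))

def read_handle_switch():
    """Stand-in for a proximity-sensor or limit-switch read."""
    return next(_SIMULATED_READS)

def wait_for_handle_event(poll_s=0.0, stable_reads=5):
    """Block until the handle state changes and stays stable, then report it."""
    last = read_handle_switch()
    while True:
        time.sleep(poll_s)
        candidate = read_handle_switch()
        if candidate == last:
            continue
        # Require several consistent reads so contact bounce is ignored.
        if all(read_handle_switch() == candidate for _ in range(stable_reads)):
            return "extended" if candidate else "retracted"

print(wait_for_handle_event())  # -> "extended"
```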
  • Audiovisual, Physical and Tactile Output 155 (Output 155)
  • Accelerometer 125, tactile sensor 130, microphone 135, camera 140, rotational sensor 145 and handle sensor 150 (together, sensory input 120) may record sensory input and send sensory input data to data processing 105. Data processing 105 may send commands to cause audiovisual, physical and tactile output from output 155 as described herein. More specifically, a user's interactions with embodiments described herein may cause movements or reactions that mimic those of animals (i.e., zoolocomotion and zoomimicry, respectively).
  • Vibratory devices such as haptic motion devices and vibratory drivers 160 may cause vibrations of embodiments described herein that are detectable by a user. Vibratory driver 160 may be a solid-state vibratory driver or any known vibratory driver. In one embodiment, vibratory driver 160 may mimic the vibrations of an animal's body when “purring” occurs (e.g., a cat's purring can be felt as vibrations by a user upon being petted). However, the inventors contemplate embodiments described herein conducting any and all types of zoolocomotion and zoomimicry.
  • Embodiments described herein provide for inputs from sensory input 120 to trigger outputs from output 155. By way of example and not limitation, tactile sensor 130 may detect a “petting” motion (e.g., such as when one strokes a pet affectionately) and may communicate touch data to data processing 105. Said petting motion may be detected by detecting and tracking contact movement across a surface. In turn, data processing 105 may trigger vibratory driver 160. In this manner, tactile sensor 130 may trigger vibratory driver 160, producing zoomimicry by embodiments described herein. By way of example and not limitation, zoomimicry in this example may take the appearance of “purring” (such as that conducted by a cat in response to being petted). Thus, embodiments described herein provide for zoomimicking reactions triggered by user stimuli.
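  • A minimal sketch of such petting detection follows, assuming touch samples of the form (timestamp, x, y, touching) and illustrative thresholds; the disclosure does not prescribe a particular algorithm or sample format.

```python
# Illustrative sketch only: classifying a sequence of touch samples as a
# "petting" stroke when contact moves steadily across the sensor surface.
# The sample format and thresholds are assumptions made for this example.

MIN_TRAVEL_MM = 40.0   # contact must move at least this far (assumed)
MAX_GAP_S = 0.15       # gaps longer than this break the stroke (assumed)

def is_petting(samples):
    """samples: list of (t, x_mm, y_mm, touching) tuples in time order."""
    touched = [(t, x, y) for (t, x, y, touching) in samples if touching]
    if len(touched) < 2:
        return False
    # Contact must be continuous: no long gaps between consecutive samples.
    for (t0, _, _), (t1, _, _) in zip(touched, touched[1:]):
        if t1 - t0 > MAX_GAP_S:
            return False
    # Contact must travel across the surface, not just rest in place.
    (_, x0, y0), (_, x1, y1) = touched[0], touched[-1]
    travel = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return travel >= MIN_TRAVEL_MM

# Example: a slow stroke across the sensor registers as petting, which data
# processing 105 could then answer by triggering vibratory driver 160.
stroke = [(0.00, 5, 20, True), (0.05, 20, 21, True),
          (0.10, 38, 22, True), (0.15, 55, 22, True)]
print(is_petting(stroke))  # True
```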
  • In an optional embodiment, speaker 165, connected to circuit 100, may be employed to play sounds based on commands generated by processor 110. Such sounds may include “purring,” any zoomimicry or known audio file. Moreover, these commands may occur in reaction to one or more of: data from handle sensor 150, tactile sensor 130, rotation sensor 145 or telemetry from accelerometer 125.
  • Continuing with examples provided herein, a user's petting motion detected by tactile sensor 130 or other sensory input 120 may cause speaker 165 to play a “purring sound” either in isolation or in concert with other output (e.g., simultaneous with vibrations caused by vibratory driver 160). In this manner, embodiments described herein may more realistically zoomimic animal reactions to user stimuli.
  • Display 170 may consist of any known visual display such as, by way of example and not limitation, one or more LEDs or an LCD display. In one embodiment, display 170 may take the form of a collection of LEDs. By way of example and not limitation, display 170 may be a “heart-shaped” formation of multi-colored LEDs capable of varying illumination levels and color displays.
  • While zoomimical and zoolocomotory examples have been given, the inventors contemplate that embodiments described herein are not limited to realistic forms and movements. Indeed, fantastical shapes and actions are contemplated by the inventors. A fantastical example follows: display 170 may “pulse” in response to commands output by processor 110 in reaction to movement detected by accelerometer 125. By way of example and not limitation, such pulsing may also take the form of changing shapes and/or colors, occurring at stable or varying frequencies. Moreover, the pulsing may be fashioned after an animal heartbeat.
  • Accordingly, a user may interact with embodiments described herein in a manner detectable by sensory input 120, causing display 170 to change in response. By way of example and not limitation, accelerometer 125 or other sensory input 120 may detect movement of embodiments described herein, causing display 170 to appear as a “beating heart” in a fantastical representation of life. For example, a user may “pet,” “shake,” or roll embodiments described herein, causing embodiments described herein to appear “awake” or “alive” to a user.
  • In a further example, the “heartbeat” described herein may change frequency depending on a rolling speed detected by rotation sensor 145 or other sensory input 120. In an even further example, rolling embodiments described herein faster may cause a faster “heartbeat.” In this manner, display 170 may cause embodiments described herein to take on a fantastical appearance of “excitement.”
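  • The sketch below illustrates one possible mapping from rolling speed to heartbeat rate and LED brightness; the speed range and beats-per-minute range are assumptions chosen for the example, not values from the disclosure.

```python
# Illustrative sketch only: mapping a rolling speed reported by the rotation
# sensor to a "heartbeat" pulse rate for the heart-shaped LED display.
import math

IDLE_BPM, MAX_BPM = 40.0, 160.0   # resting vs. "excited" heartbeat (assumed)
MAX_SPEED_MPS = 2.0               # speed treated as full excitement (assumed)

def heartbeat_bpm(speed_mps):
    """Faster rolling produces a faster heartbeat, clamped to a sensible range."""
    fraction = max(0.0, min(speed_mps / MAX_SPEED_MPS, 1.0))
    return IDLE_BPM + fraction * (MAX_BPM - IDLE_BPM)

def led_brightness(t_s, speed_mps):
    """Brightness (0..1) of the heart LEDs at time t_s: a raised-cosine pulse."""
    beats_per_second = heartbeat_bpm(speed_mps) / 60.0
    return 0.5 * (1.0 - math.cos(2.0 * math.pi * beats_per_second * t_s))

print(round(heartbeat_bpm(0.0)), round(heartbeat_bpm(1.0)), round(heartbeat_bpm(5.0)))
# -> 40 100 160  (speed above the assumed maximum is clamped)
```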
  • Actuator 175 may rotate, oscillate or otherwise actuate in response to commands output by data processing 105. This actuation may occur in reaction to data recorded by sensory input 120. In a further embodiment, appendages as described herein (not pictured) may be attached to actuator 175 such that the appendages undergo movement driven by actuator 175 in reaction to telemetry from accelerometer 125. These appendages may mimic animal shapes; however, the inventors contemplate that the appendages may take on any form or shape.
  • By way of example and not limitation, appendages may be attached to one or more actuators 175 to give the appearance of “cat ears.” Further in this example, user stimuli detected by sensory input 120 may cause actuators 175 and attached appendages to execute zoolocomotion.
  • Continuing with examples provided herein, a user's petting motion detected by tactile sensor 130 or other sensory input 120 may cause one or more actuators 175 to “wiggle,” by way of example and not limitation, appendages in the form of “cat ears.” Thus, “petting” embodiments described herein may cause a “lifelike” reaction in the form of “twitching” or “twisting” cat ears in a “perkily attentive” manner. Said wiggling may be effectuated by having the “ears” magnetically coupled to a support plate, which in turn is rotated or re-positioned under programmatic control. More information on such zoolocomotion is provided herein. Magnetically coupling appendages allows for changing to different ears or shapes to provide different effects. Moreover, magnetically coupling the appendages allows them to be removed for storing or shipping the suitcase.
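  • As a hedged sketch of such an ear wiggle, the snippet below oscillates hypothetical actuator targets around a rest position; the actuator call, amplitude and timing are placeholders for illustration, not the disclosed mechanism.

```python
# Illustrative sketch only: a brief "ear wiggle" expressed as a series of target
# angles sent to an actuator. set_actuator_angle is a hypothetical stand-in for
# whatever driver call the real hardware provides.
import math
import time

def set_actuator_angle(actuator_id, degrees):
    """Stand-in for the hardware call that positions actuator 175."""
    print(f"actuator {actuator_id} -> {degrees:+.1f} deg")

def wiggle_ears(actuator_ids=(0, 1), amplitude_deg=15.0, cycles=2, steps=6):
    """Oscillate each ear around its rest position, then return it to rest."""
    for step in range(cycles * steps + 1):
        angle = amplitude_deg * math.sin(2.0 * math.pi * step / steps)
        for actuator_id in actuator_ids:
            set_actuator_angle(actuator_id, angle)
        time.sleep(0.02)
    for actuator_id in actuator_ids:
        set_actuator_angle(actuator_id, 0.0)  # back to rest

wiggle_ears()  # e.g., triggered when tactile sensor 130 reports petting
```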
  • The inventors contemplate that such actuation by actuator 175 may occur either in isolation or in concert with other output. Furthering examples used herein, wiggling appendages caused by actuation of one or more actuators 175 may occur simultaneously with vibrations caused by vibratory driver 160 and/or “purring” sounds played by speaker 165. In this manner, embodiments described herein may more realistically zoomimic animal reactions to user stimuli. Again, output possibilities from output 155 are not limited to lifelike representations. Furthering examples still, display 170 may play visual output simultaneously with one or more of the output examples given herein, thus fantastically enhancing output and/or animations described herein. By way of example and not limitation, display 170 may display a pulsing heart in response to a user petting embodiments described herein.
  • Power Systems 180
  • Power systems 180 include motor/dynamo 185, power/data cable 190 and battery 195. Motor/dynamo 185 may be located in wheels (not pictured) to cause movement of embodiments described herein, or to generate electricity by acting as a dynamo when embodiments described herein are moved. In one embodiment, motor/dynamo 185 may be mechanically linked to rotation sensor 145. Motor/dynamo 185 may also cause locomotion of embodiments described herein in response to sensory input. In one embodiment, motor/dynamo 185 may cause embodiments described herein to zoolocomote. By way of example and not limitation, such zoolocomotion may mimic the way a pet follows a person (e.g., a cat follows an owner).
  • Power/data cable 190 may be connected to circuit 100 and may be employed to power or recharge one or more elements of circuit 100. Power/data cable 190 may be retractable or concealable with a cover plate as known. Battery 195 may be connected to circuit 100 and may be employed to power one or more elements of circuit 100. Battery 195 may be any known battery including rechargeable-type batteries. In one embodiment, battery 195 may be recharged by embodiments described herein, including but not limited to: motor/dynamo 185 and power/data cable 190.
  • Power/data cable 190 may be of any cable type, including USB. The inventors contemplate that power/data cable 190 may feed power into other devices to recharge them or transmit/receive data. In one embodiment, power/data cable 190 may be compatible with smart devices for multiple purposes, including but not limited to: recharging smart devices and backing up/storing data for smart devices. In one embodiment, the inventors contemplate the usage of embodiments described herein (e.g., rolling a suitcase) to charge battery 195, and in turn, battery 195 may be used to recharge a user's smart device through power/data cable 190. Any power/data cable 190, such as a USB cable, has myriad uses in the art and the inventors contemplate all such uses.
  • Data Communications System
  • While not pictured, the inventors also contemplate the addition of a communications system (not pictured), including but not limited to: wireless or wired communications (e.g., Wi-Fi, 3G, 4G or LTE data communications) as known. Wired or wireless communications may provide Internet connectivity, data transfer capability and access to cloud data storage. Data storage capabilities as described herein (e.g., memory 115) may allow for wireless backup or wired backup (e.g., through power/data cable 190) for smart devices, laptops and the like.
  • Conventional radio modules such as GPS, Bluetooth and the like may be included in some embodiments to allow for enhanced operations such as the ability to transmit, to a server, location information, or to effectuate certain motions in response to specific location information.
  • The inventors also contemplate that any data from embodiments described herein (e.g., sensory input data, telemetry and the like, or any data as described herein) may be recorded and uploaded to cloud-based Internet storage. Furthermore, the inventors contemplate storing and sharing of this data on the Internet (e.g., social media websites) with Internet users, e.g., social media users and/or other owners of embodiments described herein, such that these Internet users may be aware of each other's shared data. By way of example and not limitation, users of embodiments described herein may share data related to embodiments described herein with each other, e.g., compare telemetry. Moreover, users may download certain program instructions to alter or enhance the operations described herein.
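  • A minimal sketch of such a telemetry upload follows, using only the Python standard library; the endpoint URL and payload fields are placeholders invented for the example, since the disclosure does not name a particular service or schema.

```python
# Illustrative sketch only: packaging recent telemetry as JSON and posting it to
# a cloud endpoint. The URL and payload fields are placeholders; the disclosure
# does not specify any particular service, schema, or transport.
import json
import urllib.request

CLOUD_ENDPOINT = "https://example.com/luggage/telemetry"  # placeholder URL

def upload_telemetry(device_id, samples):
    """POST a small JSON document of recent sensor samples to cloud storage."""
    payload = json.dumps({"device_id": device_id, "samples": samples}).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.status  # 200 on success

# Example call (requires a reachable endpoint, so it is left commented out):
# upload_telemetry("suitcase-205", [{"sensor": "rotation", "rpm": 42}])
```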
  • The above illustration provides many different embodiments for implementing different features of embodiments described herein. Specific embodiments of components and processes are described to help clarify embodiments described herein. These are, of course, merely embodiments and are not intended to limit embodiments described herein from that described in the claims.
  • FIG. 2
  • FIG. 2 illustrates an interactive and animate suitcase, according to one embodiment of the present disclosure. View 200 shows a front three-quarters profile of suitcase 205. Suitcase 205 includes wheels 210, tactile sensor 215, one or more appendages 220, display 225, casters 230, handle 255, and actuators 275.
  • Suitcase 205 may be of impact-resistant or ballistic material as known. Suitcase 205 may be water resistant or waterproof (e.g., IP68 or IP69K) as known. Suitcase 205 may contain multiple, isolated compartments as known. Suitcase 205 may be a clamshell or foldable design as known. Suitcase 205 may be a hardshell, soft fabric, or hybrid design as known. Suitcase 205 may have zippers, clasps, buttons, magnetic enclosures and may seal closed by any known means.
  • Some embodiments may include handles 255 made from clear plastic, providing for certain illumination effects. This may be effectuated by placing a light source, such as one or more LEDs (not shown), at the base of the handle 255. With the LEDs under programmatic control, the color and light intensity may be varied for certain movements or in response to certain stimuli.
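  • The sketch below shows one hypothetical mapping from a detected stimulus to a handle-LED color and intensity; the stimulus names and RGB values are assumptions made for illustration only.

```python
# Illustrative sketch only: choosing a color and intensity for LEDs at the base
# of the clear handle based on the most recent stimulus. The stimulus names and
# RGB choices below are invented for this example.

LED_RESPONSES = {
    "petting": ((255, 105, 180), 0.6),   # warm pink, medium brightness
    "rolling": ((0, 200, 255), 1.0),     # bright cyan while in motion
    "handle":  ((255, 255, 255), 0.8),   # white flash when the handle deploys
    "idle":    ((40, 40, 40), 0.1),      # dim glow at rest
}

def handle_led_command(stimulus):
    """Return (rgb, intensity) for the handle LEDs given a detected stimulus."""
    return LED_RESPONSES.get(stimulus, LED_RESPONSES["idle"])

print(handle_led_command("rolling"))   # ((0, 200, 255), 1.0)
print(handle_led_command("unknown"))   # falls back to the dim idle glow
```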
  • Optionally, suitcase 205 may include speakers, handle sensors and/or vibratory drivers (not shown) as described herein. In some embodiments, casters 230 may be fixed- or spinner-style, as known. In other embodiments, wheels 210 and/or casters 230 may be of varying opacity (e.g., transparent). In one embodiment, casters 230 and/or wheels 210 may be transparent with an opaque portion to give the appearance of “paws” or “feet.”
  • Sensory Input
  • Tactile sensor 215 may detect physical contact by a user as described herein. Tactile sensor 215 may transmit sensory data to a processor. In one embodiment, tactile sensor 215 may detect, by way of example and not limitation, “petting” of suitcase 205 by a user in a similar manner that a user may pet an animal. While “petting” of tactile sensor 215 is provided as an example, the inventors contemplate any physical interaction with tactile sensor 215 to cause tactile sensor 215 to transmit sensory data to the processor.
  • As illustrated, tactile sensor 215 is positioned on suitcase 205 near appendages 220; however, the inventors contemplate positioning one or more tactile sensors 215 on any location on suitcase 205. Furthermore, tactile sensor 215 may take any shape and occupy any surface of suitcase 205.
  • A rotation sensor (not shown) may be mechanically linked with one or more wheels 210 in order to detect rotation of wheels 210, and thus locomotion of suitcase 205 by a user. In turn, the rotation sensor may send telemetry to a processor. The rotation sensor may be mounted coaxially to, or otherwise mechanically engaged with, one or more wheels 210 such that wheels 210 supply rotational drive to the rotation sensor by known means.
  • An accelerometer (not shown) may be employed in suitcase 205 to detect movement of embodiments described herein, or physical contact of embodiments described herein by a user. Thus, the accelerometer may transmit telemetry to a processor. By way of example and not limitation, the accelerometer may detect rolling of suitcase 205 by a user, “petting” or shaking of suitcase 205 by a user.
  • Finally, a microphone and/or camera (not shown) may be located proximate to suitcase 205 to record audiovisual data. Audiovisual data may be sent to a processor and may in turn cause audiovisual, physical or tactile output as described herein.
  • Handle 255, as shown in rear three-quarters view 250, may allow a user to drag or push suitcase 205. Handle 255 may be a fixed- or telescoping-style handle as known. Handle 255 may also contain a handle sensor (not pictured) that sends information related to handle movement to a processor. In one embodiment, handle 255, when deployed or retracted by a user, may trigger activation/deactivation of embodiments described herein. In further embodiments, such triggering may reduce charge depletion and improve battery performance.
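  • One way such handle-triggered activation and deactivation could be organized is the small wake/sleep state machine sketched below; the state names and the subsystems being gated are assumptions for illustration, not requirements of the disclosure.

```python
# Illustrative sketch only: a tiny wake/sleep state machine keyed off handle
# events, as one way the handle sensor could reduce charge depletion.

class PowerController:
    """Track whether the interactive features are awake or asleep."""

    def __init__(self):
        self.state = "asleep"   # display, speaker and actuators start off

    def on_handle_event(self, event):
        """React to 'extended' / 'retracted' events from the handle sensor."""
        if event == "extended" and self.state == "asleep":
            self.state = "awake"
            return "enable outputs: heartbeat display, ear actuators, speaker"
        if event == "retracted" and self.state == "awake":
            self.state = "asleep"
            return "disable outputs to conserve battery 195"
        return "no change"

controller = PowerController()
print(controller.on_handle_event("extended"))   # suitcase "wakes up"
print(controller.on_handle_event("retracted"))  # suitcase goes back to sleep
```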
  • Audiovisual, Physical & Tactile Output
  • As illustrated, appendages 220 appear as “cat ears,” but appendages 220 may, by way of example and not limitation, take any zoomorphic form, and the inventors contemplate appendages 220 taking the form of any known three-dimensional shape. Also as illustrated, appendages 220 are located, by way of example and not limitation, on top of suitcase 205. However, the inventors contemplate placing appendages 220 at any location proximate to suitcase 205. In one embodiment, appendages 220 may be removable and stored in suitcase 205 to allow suitcase 205 to occupy an overall smaller volume profile. In this manner, such a reduced footprint may allow for easier storage in airplane overhead compartments and better compatibility with passenger airline regulations.
  • Appendages 220 may be driven by actuators 275, as shown in top front (cutaway) view 270. Under the power of actuators 275, appendages 220 may “move,” or more specifically, appendages 220 may coapproach or oscillate. By way of example and not limitation, appendages 220 may “twitch,” “wiggle,” or “twist,” in a manner similar to the motion of the ears of a cat. Note, however, that the inventors contemplate any and all movements of appendages 220 physically possible by actuators 275. By way of example and not limitation, such movements may take the form of zoolocomotion. Movement of appendages 220 may occur in reaction to accelerometer/rotation sensor telemetry, handle sensor or tactile sensor 215 data being received and processed by a processor.
  • Display 225 may be lit by a power source (not shown) in suitcase 205. As illustrated, display 225 takes the form of LEDs deployed in a “heart” formation on suitcase 205, but the inventors contemplate that display 225 may take any shape or be of any number, and be located on any position on suitcase 205. LEDs are illustrated only by way of example and not limitation. Thus, embodiments described herein provide for any known video displays, such as LCD screens and the like. Furthermore, the inventors contemplate the display of any and all visual media on display 225, and contemplate a processor updating or otherwise changing this visual media based on sensory input data as provided herein. In one embodiment, display 225 may illuminate in reaction to sensory input as described herein.
  • In one embodiment, a speaker (not shown) may cause sound in reaction to accelerometer/rotation sensor telemetry, microphone, camera, handle sensor or tactile sensor 215 data being received and processed by a processor as described herein. By way of example and not limitation, the speaker may play music or animal sounds; however, the inventors contemplate any and all known audio output.
  • A vibratory driver (not shown) may be employed to cause vibration of suitcase 205 or parts of the suitcase according to some embodiments described herein. In one embodiment, the vibratory driver may vibrate suitcase 205 in reaction to accelerometer/rotation sensor telemetry, handle sensor or tactile sensor 215 data being received and processed by a processor.
  • By way of example and not limitation, output by appendages 220, display 225, a vibratory driver or speaker may occur in reaction to data from a user “petting” suitcase 205, rolling or shaking suitcase 205, using (e.g., deploying/retracting) handle 255, or any other stimuli. In other words, a user interfacing with suitcase 205 may cause output by appendages 220, display 225, a vibratory driver or speaker to imitate a “reaction” by suitcase 205. While zoolocomotory/zoomimical examples have been provided, the inventors contemplate any and all possible movements as reactions by suitcase 205 to user stimuli. Furthermore, the speed and/or frequency of output by appendages 220, display 225, a vibratory driver or speaker may vary in reaction to user stimuli.
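  • One way to picture this stimulus-to-reaction mapping, including scaling the speed or frequency of output with the strength of the stimulus, is the dispatcher sketch below; the event names, handler table, and scaling factors are assumptions for illustration only.

```python
# Illustrative dispatcher: route a user stimulus to one or more output
# handlers, scaling their speed with stimulus intensity. All names and
# constants are assumptions made for this sketch.

def dispatch_reaction(event: str, intensity: float, handlers: dict) -> None:
    """intensity is a normalized 0..1 reading (e.g., tactile pressure or
    accelerometer magnitude); each handler accepts a speed multiplier so
    stronger stimuli produce faster or more frequent output."""
    speed = 0.5 + 1.5 * max(0.0, min(1.0, intensity))  # 0.5x .. 2.0x
    for handler in handlers.get(event, []):
        handler(speed)

if __name__ == "__main__":
    handlers = {
        "pet": [lambda s: print(f"wiggle ears at {s:.1f}x"),
                lambda s: print(f"pulse heart LEDs at {s:.1f}x"),
                lambda s: print(f"purr at {s:.1f}x")],
    }
    dispatch_reaction("pet", intensity=0.8, handlers=handlers)
```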
  • In one embodiment, pulling up on handle 255 (i.e., telescoping or deploying it) may cause embodiments described herein to appear to “wake up,” e.g., triggering a “heartbeat” on display 225, movement of appendages 220, or other audiovisual, physical or tactile output as described herein.
  • Power System
  • Suitcase 205 may include one or more of: a power cable (not pictured) or a battery (not pictured) as described herein. One or more wheels 210 may include one or more motor/dynamos (not pictured). Motor/dynamos may recharge a battery (not shown) or provide locomotion to suitcase 205, as described herein. Motor/dynamos may also cause suitcase 205 to move in response to sensory input as described herein. In one embodiment, motor/dynamos may cause suitcase 205 to zoolocomote (by way of example and not limitation: “follow” a user in a manner similar to a pet following an owner) based on commands from a processor and/or sensory input as described herein.
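  • A highly simplified sketch of such “follow the user” behavior is given below; how the user's bearing and distance are sensed, and the control gains used, are assumptions outside the scope of this sketch and are not prescribed by the disclosure.

```python
def follow_step(bearing_deg: float, distance_m: float,
                stop_distance_m: float = 0.8, max_speed: float = 1.0):
    """Compute left/right wheel motor commands for one control step.

    bearing_deg: user's direction relative to the suitcase heading
    (negative = left, positive = right); distance_m: range to the user.
    The gains, units, and thresholds are illustrative assumptions only.
    """
    if distance_m <= stop_distance_m:
        return 0.0, 0.0                      # close enough: stop ("heel")
    forward = min(max_speed, 0.6 * (distance_m - stop_distance_m))
    turn = 0.02 * bearing_deg                # simple proportional steering
    left = max(-max_speed, min(max_speed, forward + turn))
    right = max(-max_speed, min(max_speed, forward - turn))
    return left, right

if __name__ == "__main__":
    print(follow_step(bearing_deg=15.0, distance_m=2.0))   # veer right, move
    print(follow_step(bearing_deg=0.0, distance_m=0.5))    # stop near user
```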
  • FIG. 3
  • FIG. 3 illustrates a method for driving audiovisual, physical and/or tactile output based on sensory input, according to one embodiment of the present disclosure. Although the method steps are described in conjunction with FIGS. 1-3, persons skilled in the art will understand that any system configured to perform the method steps, in any order, falls within the scope of the present disclosure. The steps in this method are illustrative only and do not necessarily need to be performed in the order in which they are presented herein. Some steps may be omitted completely.
  • The method begins at a step 305, in which sensory input is received by a sensor. In some embodiments, sensors may be tactile sensors, handle sensors, rotational sensors or accelerometers as described herein. By way of example and not limitation, sensory input for a tactile sensor may take the form of touch data, such as if a user were to “pet” the tactile sensor. A handle sensor may send a handle trigger output when, by way of example and not limitation, deployment of a telescoping suitcase handle is detected. By way of example and not limitation, an accelerometer or a rotational sensor mechanically linked to a wheel may send telemetry related to movement of embodiments described herein.
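  • As an illustrative reading of step 305, the sketch below normalizes raw readings from the different sensor types into a common event record before it is passed to the processor; the field names and conversion functions are assumptions made for this example.

```python
from dataclasses import dataclass
import time

@dataclass
class SensorEvent:
    """Normalized sensory input record passed on to the processor (step 310).

    Field names are assumptions for this sketch, not part of the disclosure.
    """
    source: str        # "tactile", "handle", "accelerometer", "rotation"
    kind: str          # e.g. "pet", "deploy", "shake", "roll"
    magnitude: float   # normalized 0..1 strength of the stimulus
    timestamp: float

def tactile_to_event(pressure: float) -> SensorEvent:
    """Example conversion of a raw tactile reading into a 'pet' event."""
    return SensorEvent("tactile", "pet", min(1.0, pressure), time.time())

def handle_to_event(deployed: bool) -> SensorEvent:
    return SensorEvent("handle", "deploy" if deployed else "retract",
                       1.0, time.time())

if __name__ == "__main__":
    print(tactile_to_event(pressure=0.6))
    print(handle_to_event(deployed=True))
```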
  • At a step 310, sensory input data is transmitted to a processor. At a step 315, the processor may execute instructions from software stored in memory in response to and/or dependent on sensory input data. By way of example and not limitation, software may take the form of firmware, software loaded in short- or long-term data storage as described herein, or Internet/cloud-stored information.
  • At a step 320, the processor may send commands based on the instructions to an audio/visual, physical or tactile output system (output system). In some embodiments, an output system may include one or more of the following: motor/dynamos, vibratory drivers, displays and speakers as described herein. By way of example and not limitation, a command sent by the processor to a motor/dynamo may cause voltage to be applied to the motor/dynamo, causing a suitcase to zoolocomote, as described herein. By way of example and not limitation, commands to a vibratory driver may cause voltage to be applied to the vibratory driver, causing a suitcase to zoomimic a cat (e.g., “purring”), as described herein. By way of example and not limitation, commands sent to a speaker may take the form of audio data signals causing a suitcase to zoomimic “purring” sounds from a speaker, as described herein. By way of example and not limitation, commands sent to a display may take the form of voltage applied to an LED or video data sent to a display, causing pulsation/illumination of a heart formation of LEDs or fantastical representations of lifelike appearances and animations as described herein.
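  • Putting steps 305 through 320 together, a minimal end-to-end control loop might resemble the sketch below; the event queue, the command table, and the send_command interface are all assumptions made for illustration and are not prescribed by this disclosure.

```python
import queue

def control_loop(events: queue.Queue, command_table: dict, send_command) -> None:
    """Illustrative steps 305-320: receive sensory input, look up the stored
    instructions for it, and emit commands to the output system.

    events: queue of (source, kind, magnitude) tuples from the sensors.
    command_table: maps an event kind to a list of (device, command) pairs.
    send_command(device, command, magnitude): assumed output-system interface.
    """
    while True:
        try:
            source, kind, magnitude = events.get(timeout=1.0)   # steps 305/310
        except queue.Empty:
            break
        for device, command in command_table.get(kind, []):     # step 315
            send_command(device, command, magnitude)             # step 320

if __name__ == "__main__":
    q = queue.Queue()
    q.put(("tactile", "pet", 0.7))
    table = {"pet": [("ears", "wiggle"), ("display", "heartbeat"),
                     ("vibration", "purr"), ("speaker", "purr.wav")]}
    control_loop(q, table, lambda d, c, m: print(f"{d}: {c} @ {m:.1f}"))
```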
  • Although embodiments described herein are illustrated and described herein as embodied in one or more specific examples, it is nevertheless not intended to be limited to the details shown, since various modifications and structural changes may be made therein without departing from the spirit of embodiments described herein and within the scope and range of equivalents of the claims. Moreover, this application includes additional images in the attached appendix to the specification. Accordingly, it is appropriate that the appended claims be construed broadly and in a manner consistent with the scope of embodiments described herein, as set forth in the following claims.

Claims (12)

1. A container including:
a plurality of wheels,
an extendable handle, coupled to said container;
a processor coupled to the container, said processor coupled to a memory device, a motion detector and an actuator;
said memory device operable to hold program instructions directing the processor to perform a method including:
detecting motion and moving an appendage in response to the motion.
2. The container of claim 1 wherein the moving the appendage includes zoolocomotion.
3. The container of claim 1 wherein the appendage is substantially in the form of cat ears.
4. The container of claim 1 wherein the container is a suitcase.
5. The container of claim 4 further including:
a wireless communication system, said wireless communications system coupled to the processor;
wherein the processor and wireless communications system are operable to send and receive information to alter the program instructions.
6. The container of claim 5 wherein the wireless communications system is either Bluetooth or Wi-Fi.
7. The container of claim 1 further including a tactile sensor, said tactile sensor coupled to the processor.
8. The container of claim 1 wherein the motion detector is an accelerometer.
9. A suitcase including:
at least one wheel, said wheel including a rotation sensor;
a processor coupled to the suitcase, said processor coupled to a memory device, a tactile sensor, a motion detector, an actuator and the rotation sensor;
said memory device operable to hold program instructions directing the processor to perform a method including one or more of the following:
detecting motion and moving an appendage in response to the motion,
detecting rotation and moving an appendage in response to the rotation,
detecting touch and moving an appendage in response to the touch;
wherein said appendage substantially represents an animal ear;
wherein the moving the appendage includes movement of the animal ear in a substantially zoomimical motion.
10. The suitcase of claim 9 further including:
a second appendage,
wherein said moving the appendage includes moving both appendages.
11. A suitcase including:
at least one wheel, said wheel including a rotation sensor;
a substantially transparent handle, said handle disposed on a first surface of the suitcase;
a light source, said light source disposed to illuminate at least a portion of the handle;
a processor coupled to the suitcase, said processor coupled to a memory device, a tactile sensor, a motion detector, the light source and the rotation sensor;
said memory device operable to hold program instructions directing the processor to perform a method including one or more of the following:
detecting motion and illuminating the light source in response to the motion,
detecting rotation and illuminating the light source in response to the rotation, or
detecting touch and illuminating the light source in response to the touch.
12. The suitcase of claim 11 wherein the light source includes multi-colored LEDs.
US16/190,526 2016-09-20 2018-11-14 Interactive animate luggage Abandoned US20190231045A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/190,526 US20190231045A1 (en) 2016-09-20 2018-11-14 Interactive animate luggage

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US29/578,325 USD824676S1 (en) 2016-09-20 2016-09-20 Suitcase
US29617217 2017-09-12
US201762587211P 2017-11-16 2017-11-16
US16/190,526 US20190231045A1 (en) 2016-09-20 2018-11-14 Interactive animate luggage

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US29617217 Continuation-In-Part 2016-09-20 2017-09-12

Publications (1)

Publication Number Publication Date
US20190231045A1 true US20190231045A1 (en) 2019-08-01

Family

ID=67392986

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/190,526 Abandoned US20190231045A1 (en) 2016-09-20 2018-11-14 Interactive animate luggage

Country Status (1)

Country Link
US (1) US20190231045A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090209166A1 (en) * 2008-02-19 2009-08-20 Samuel Chen Shipping box toy
US20140107868A1 (en) * 2012-10-15 2014-04-17 Mirko DiGiacomcantonio Self-propelled luggage
US8960959B2 (en) * 2013-03-14 2015-02-24 Lien-Ti Chen Handle assembly for a luggage
US9788619B2 (en) * 2013-04-18 2017-10-17 Robert Dale Beadles Suitcase
US20180116361A1 (en) * 2016-11-03 2018-05-03 Universal Travel Systems Inc. Multifunctional smart luggage carrier
US20190277552A1 (en) * 2018-03-12 2019-09-12 Dac V. Vu Smart delivery package storage container

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3868248A1 (en) * 2020-02-20 2021-08-25 Sega Toys Co., Ltd. Carrying vessel, such as purse, with electrical conduits
JP7569224B2 (en) 2020-02-20 2024-10-17 株式会社セガフェイブ Carrying container
CN111399407A (en) * 2020-03-10 2020-07-10 江苏理工学院 Control system and control method of intelligent luggage case
USD1014079S1 (en) 2021-04-22 2024-02-13 Spin Master Ltd. Toy bag
USD980624S1 (en) 2021-05-31 2023-03-14 Spin Master Ltd. Toy bag
US11419399B1 (en) 2021-09-01 2022-08-23 Spin Master Ltd. Storage device with movable element

Similar Documents

Publication Publication Date Title
US20190231045A1 (en) Interactive animate luggage
US10792578B2 (en) Interactive plush character system
US20130095725A1 (en) Figurine toy in combination with a portable, removable wireless computer device having a visual display screen
US11000952B2 (en) More endearing robot, method of controlling the same, and non-transitory recording medium
CN105264452B (en) Multipurpose self-advancing device
CN103930182B (en) Baseplate assembly for use with toy pieces
US10589685B1 (en) Portable expandable mirrors with lights for use in motor vehicles and elsewhere
US20230241782A1 (en) Condition-Based Robot Audio Techniques
US20180117762A1 (en) Data exchange system
JP7128842B2 (en) Entertainment systems, robotic devices and server devices
US20150104774A1 (en) Interactive educational system
CN105498228A (en) Intelligent robot learning toy
JP2019115481A (en) Information processing program, information processing system, information processing device, and information processing method
KR101685401B1 (en) Smart toy and service system thereof
US20140378025A1 (en) Simulated Head Apparatus with LCD Display Face
JP7444460B2 (en) charging station for robots
US20220100281A1 (en) Managing states of a gesture recognition device and an interactive casing
GB2331713A (en) Stuffed toys
EP3228426A1 (en) System for content recognition and response action
JP2019111603A (en) Robot, robot control system, robot control method and program
US20210289904A1 (en) Carrying vessel, such as purse, with electrical conduits
CN105137861A (en) Self-propelled device with magnetically coupling
CN207694257U (en) Interactive robot toy and the interacting toys that user's finger can be attached to
US20220297018A1 (en) Robot, robot control method, and storage medium
WO2021149516A1 (en) Autonomous mobile body, information processing method, program, and information processing device

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION