

The cubic mouse: a new device for three-dimensional input

Published: 01 April 2000

Abstract

We have developed a new input device that allows users to intuitively specify three-dimensional coordinates in graphics applications. The device consists of a cube-shaped box with three perpendicular rods passing through the center and buttons on the top for additional control. The rods represent the X, Y, and Z axes of a given coordinate system. Pushing and pulling the rods specifies constrained motion along the corresponding axes. Embedded within the device is a six degree of freedom tracking sensor, which allows the rods to be continually aligned with a coordinate system located in a virtual world. We have integrated the device into two visualization prototypes for crash engineers and geologists from oil and gas companies. In these systems the Cubic Mouse controls the position and orientation of a virtual model and the rods move three orthogonal cutting or slicing planes through the model. We have evaluated the device with experts from these domains, who were enthusiastic about its ease of use.
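
The abstract does not include an implementation, but the mapping it describes (rod displacement driving constrained motion of a slicing plane, with the embedded tracker keeping the device axes aligned to the virtual model) can be illustrated with a short sketch. The Python example below is hypothetical and rests on assumed conventions: normalized rod offsets, a tracker pose given as a rotation matrix and a position vector, and planes represented as (normal, point) pairs. None of these names or interfaces come from the paper.

```python
# Hypothetical sketch (not from the paper) of the mapping described in the
# abstract: three rod displacements drive three orthogonal slicing planes,
# while the 6-DOF tracker keeps the device axes aligned with the model.
import numpy as np


def update_slice_planes(rod_offsets, tracker_rotation, tracker_position, model_half_extent):
    """Return three (normal, point) slicing planes in world coordinates.

    rod_offsets       : (3,) normalized push/pull amounts for the X, Y, Z rods, in [-1, 1]
    tracker_rotation  : (3, 3) rotation matrix of the embedded 6-DOF sensor
    tracker_position  : (3,) world-space position of the device / model origin
    model_half_extent : (3,) half-size of the model along each local axis
    """
    planes = []
    for axis in range(3):
        # Rod direction in world space: the device axis rotated by the tracker
        # pose, so the slicing plane stays aligned with the model.
        normal = tracker_rotation[:, axis]
        # Push or pull the plane along its own normal in proportion to the rod offset.
        point = tracker_position + normal * rod_offsets[axis] * model_half_extent[axis]
        planes.append((normal, point))
    return planes


if __name__ == "__main__":
    # Device held axis-aligned, model centered at the origin,
    # X rod pushed halfway in, Z rod pulled out a quarter of the way.
    planes = update_slice_planes(
        rod_offsets=np.array([0.5, 0.0, -0.25]),
        tracker_rotation=np.eye(3),
        tracker_position=np.zeros(3),
        model_half_extent=np.array([1.0, 1.0, 1.0]),
    )
    for normal, point in planes:
        print("plane normal", normal, "through point", point)
```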






Information

Published In

CHI '00: Proceedings of the SIGCHI conference on Human Factors in Computing Systems
April 2000
587 pages
ISBN:1581132166
DOI:10.1145/332040
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 01 April 2000


Author Tags

  1. two-handed interaction
  2. user interface hardware
  3. virtual reality

Qualifiers

  • Article

Conference

CHI '00: Human Factors in Computing Systems
April 1 - 6, 2000
The Hague, The Netherlands

Acceptance Rates

CHI '00 Paper Acceptance Rate: 72 of 336 submissions, 21%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%



Bibliometrics & Citations

Article Metrics

  • Downloads (Last 12 months): 133
  • Downloads (Last 6 weeks): 22
Reflects downloads up to 09 Nov 2024


Cited By

  • (2024) HoberUI: An Exploration of Kinematic Structures as Interactive Input Devices. Multimodal Technologies and Interaction, 8(2), 13. DOI: 10.3390/mti8020013. Online publication date: 13-Feb-2024.
  • (2024) A Benchmark Dataset for Evaluating Spatial Perception in Multimodal Large Models. Proceedings of the First International Workshop on IoT Datasets for Multi-modal Large Model, 37-43. DOI: 10.1145/3698385.3699875. Online publication date: 4-Nov-2024.
  • (2024) Exploiting Physical Referent Features as Input for Multidimensional Data Selection in Augmented Reality. ACM Transactions on Computer-Human Interaction, 31(4), 1-40. DOI: 10.1145/3648613. Online publication date: 19-Sep-2024.
  • (2024) How Learners' Visuospatial Ability and Different Ways of Changing the Perspective Influence Learning About Movements in Desktop and Immersive Virtual Reality Environments. Educational Psychology Review, 36(3). DOI: 10.1007/s10648-024-09895-w. Online publication date: 22-Jun-2024.
  • (2023) A cyber-physical system to design 3D models using mixed reality technologies and deep learning for additive manufacturing. PLOS ONE, 18(7), e0289207. DOI: 10.1371/journal.pone.0289207. Online publication date: 27-Jul-2023.
  • (2023) SurfAirs: Surface + Mid-air Input for Large Vertical Displays. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3544548.3580877. Online publication date: 19-Apr-2023.
  • (2023) The Impact of Fog Computing in the IoT World. 2023 Global Conference on Wireless and Optical Technologies (GCWOT), 1-7. DOI: 10.1109/GCWOT57803.2023.10064669. Online publication date: 24-Jan-2023.
  • (2023) Mixed Reality Interaction Techniques. Springer Handbook of Augmented Reality, 109-129. DOI: 10.1007/978-3-030-67822-7_5. Online publication date: 1-Jan-2023.
  • (2022) Composites: A Tangible Interaction Paradigm for Visual Data Analysis in Design Practice. Proceedings of the 2022 International Conference on Advanced Visual Interfaces, 1-9. DOI: 10.1145/3531073.3531091. Online publication date: 6-Jun-2022.
  • (2022) A Survey on Cross-Virtuality Analytics. Computer Graphics Forum, 41(1), 465-494. DOI: 10.1111/cgf.14447. Online publication date: 8-Feb-2022.
