US20110169826A1 - Universal collaborative pseudo-realistic viewer - Google Patents
Universal collaborative pseudo-realistic viewer
- Publication number
- US20110169826A1 (application US 13/120,721)
- Authority
- US
- United States
- Prior art keywords
- pseudo
- scene
- realistic image
- rendered
- view
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
Definitions
- the invention relates generally to the field of visual modeling, and in particular to a method and apparatus providing collaborative pseudo-realistic rendering of a visual model responsive to user inputs.
- Building information modeling is the process of generating and managing building data during its life cycle. Typically it uses three-dimensional, dynamic building modeling software to increase productivity in building design and construction.
- the term building design and construction is not limited to physical dwellings and/or offices, but is meant to additionally include any construction project including, without limitation, road and infrastructure projects.
- the process produces a Building Information Model (BIM), which as used herein comprises building geometry, spatial relationships, geographic information, and quantities and properties of building components, irrespective of whether we are dealing with a physical building or a general construction project including land development and infrastructure.
- 3D visualization applications provide photo-realistic results using techniques such as ray tracing, radiosity, global illumination and other shading, shadowing and light reflection techniques.
- Such 3D visualization applications provide a 3D generated model, without relationship to the existing environment.
- Rapid Design Visualization is a software application available from RDV Systems, Ltd. of Lod, ISRAEL, which enables any Civil 3D user to create a fully interactive visualization environment directly from their Civil 3D project.
- Civil 3D is a software BIM solution for the field of civil engineering available from Autodesk, Inc. of San Rafael, Calif.
- the Rapid Design Visualization software enables a Civil 3D designer to easily create drive through simulations, flyovers and interactive simulations for proposed roads, subdivisions, underground infrastructure, interchanges and many other complex land development projects. Such an interactive simulation enables a potential user, developer, or investor, to visualize a Civil 3D project in an office environment.
- Freewheel software available from Autodesk, Inc. of San Rafael, Calif., provides the ability to upload design data to a remote server. Any user allowed access can then utilize a variety of navigation tools to review the design.
- advantageously, no software download is required, as the requirement to download and install software is often an unbridgeable barrier in today's IT environment; disadvantageously, collaborative tools are not provided.
- a server comprising a computer-readable medium containing instructions for controlling an electronic device to perform a method of visualization, the method comprising: loading a 3 dimensional (3D) scene comprising visual model data; rendering a first pseudo-realistic image of the loaded 3D scene responsive to a first view positional indicator; transmitting the rendered first pseudo-realistic image to at least two remote computing platforms; receiving from any of the at least two remote computing platforms a scene control command; rendering a second pseudo-realistic image of the loaded 3D scene responsive to the received scene control command; and transmitting the rendered second pseudo-realistic image to the at least two remote computing platforms.
- the server is arranged to provide a full active collaborative interactive viewing experience between disparate devices.
- active collaboration is meant to include wherein various users may simultaneously interact with the 3D model, with interaction displayed to all of the various users.
- one of the computing platforms is a cellular telephone, and another of the computing platforms is a portable computer. Either of the devices can provide input to the shared collaborative viewing experience.
- a third computing platform is provided, the third computing platform arranged to generate the 3D scene internally responsive to positional indicator information.
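To make the recited method concrete, the following is a minimal sketch of the collaborative loop in Python. Everything here, the Session class, the ViewIndicator fields and the callback-style clients, is an illustrative assumption; the patent does not specify an implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ViewIndicator:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0    # degrees
    pitch: float = 0.0  # degrees

@dataclass
class Session:
    scene: dict                                   # loaded visual model data
    clients: list = field(default_factory=list)   # one transmit callback per remote platform
    view: ViewIndicator = field(default_factory=ViewIndicator)

    def render(self) -> bytes:
        # Stand-in for the pseudo-realistic renderer (shading, texturing, ...).
        return f"image@({self.view.x},{self.view.y},{self.view.z})".encode()

    def broadcast(self) -> None:
        image = self.render()
        for send in self.clients:                 # transmit to at least two remote platforms
            send(image)

    def on_scene_control(self, new_view: ViewIndicator) -> None:
        # A scene control command received from ANY client updates the shared
        # view; the re-rendered image is echoed to every participant.
        self.view = new_view
        self.broadcast()

# Two disparate clients share one session; either may drive the view.
session = Session(scene={"name": "interchange"},
                  clients=[lambda img: print("phone: ", img.decode()),
                           lambda img: print("laptop:", img.decode())])
session.broadcast()                               # first image, default view indicator
session.on_scene_control(ViewIndicator(x=10.0))   # command from either client
```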
- a computer-readable medium containing instructions for controlling an electronic device to perform a method of visualization comprising: loading a 3 dimensional (3D) scene comprising visual model data; rendering a first pseudo-realistic image of the loaded 3D scene responsive to a first view positional indicator; transmitting the rendered first pseudo-realistic image to at least two remote computing platforms; receiving from any of the at least two remote computing platforms a scene control command; rendering a second pseudo-realistic image of the loaded 3D scene responsive to the received scene control command; and transmitting the rendered second pseudo-realistic image to the at least two remote computing platforms.
- the scene control command comprises a second view positional indicator different from the first view positional indicator.
- the method further comprises: performing an analysis of at least one criterion of the visual model data responsive to each of the first and second positional indicators; and transmitting at least one result of the performed analysis in concert with the respective transmitted rendered first and second pseudo-realistic images.
- the scene control command comprises one of turning off at least one object of the 3D scene, changing illumination of at least a portion of the 3D scene and changing a material type for at least one object of the 3D scene.
- the scene control command comprises a highlight indicator.
- the first pseudo-realistic image exhibits an adjustable field of view
- the rendered first pseudo-realistic image presents a view frustum responsive to the first view positional indicator and the adjustable field of view.
- the rendering of the first and second pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation.
- At least one of the transmitted rendered first pseudo-realistic image and the transmitted rendered second pseudo-realistic image comprises an omni-directional view.
- the visual model data is provided in a selectable one of a plurality of formats.
- the computer readable medium is embeddable into a web site.
- a server comprising a computing device and a communication module
- the computing device arranged to: load a 3 dimensional (3D) scene comprising visual model data; render a first pseudo-realistic image of the loaded 3D scene responsive to a first view positional indicator; transmit the rendered first pseudo-realistic image via the communication module to at least two remote computing platforms; receive, via the communication module, from any of the at least two remote computing platforms a scene control command; render a second pseudo-realistic image of the loaded 3D scene responsive to the received scene control command; and transmit the rendered second pseudo-realistic image to the at least two remote computing platforms.
- the scene control command comprises a second view positional indicator different from the first view positional indicator.
- the computing device is further arranged to: perform an analysis of at least one criterion of the visual model data responsive to each of the first and second positional indicators; and transmit at least one result of the performed analysis to the at least two remote computing platforms in concert with the respective transmitted rendered first and second pseudo-realistic images.
- the scene control command comprises one of: turning off at least one object of the 3D scene, changing illumination of at least a portion of the 3D scene and changing a material type for at least one object of the 3D scene.
- the scene control command comprises a highlight indicator.
- the first pseudo-realistic image exhibits an adjustable field of view
- the rendered first pseudo-realistic image presents a view frustum responsive to the first view positional indicator and the adjustable field of view.
- the rendering of the first and second pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation.
- At least one of the transmitted rendered first pseudo-realistic image and the transmitted rendered second pseudo-realistic image comprises an omni-directional view.
- the visual model data is provided in a selectable one of a plurality of formats.
- a computer-readable medium containing instructions for controlling an electronic device to perform a method of visualization comprising: loading a 3 dimensional (3D) scene comprising visual model data; rendering a pseudo-realistic image of the loaded 3D scene responsive to a first view positional indicator, the pseudo-realistic image comprising an omni-directional view; and transmitting the rendered pseudo-realistic image to at least two remote computing platforms.
- FIG. 1 illustrates a high level block diagram of a system providing universal collaborative visualization in accordance with an exemplary embodiment
- FIG. 2 illustrates a rendered pseudo-realistic image in which a plurality of highlights, or indicators, each associated with a particular device of FIG. 1 is depicted;
- FIG. 3 illustrates a high level flow chart of a method of operation of the server of FIG. 1 to perform a method of visualization
- FIG. 4 illustrates a high level flow chart of a method of operation of the server of FIG. 1 to perform a method of visualization comprising an omni-directional view on demand.
- FIG. 1 illustrates a high level block diagram of a system 10 providing universal collaborative visualization in accordance with an exemplary embodiment, system 10 comprising: a server 20 comprising a processor 30 , a memory 40 , a communication module 50 and a 3D scene storage 60 ; a network 70 ; a mobile computing platform 80 comprising a processor 30 , a communication module 50 , a display 90 and a user input device 100 ; a real time position determining device 110 ; a computer 120 comprising a processor 30 , a communication module 50 , a display 90 and a user input device 100 ; and a mobile station 150 comprising a processor 30 , a communication module 50 , a display 90 and a user input device 100 .
- Processor 30 of server 20 is in communication with memory 40 , 3D scene storage 60 and communication module 50 of server 20 .
- Real time position determining device 110 is in communication with mobile computing platform 80 , and in particular with processor 30 thereof.
- Processor 30 of mobile computing platform 80 is in communication with each of communication module 50 , display 90 and user input device 100 thereof.
- Processor 30 of computer 120 is in communication with each of communication module 50 , display 90 and user input device 100 thereof.
- Processor 30 of mobile station 150 is in communication with each of communication module 50 , display 90 and user input device 100 thereof.
- network 70 is a combination of a General Packet Radio Service (GPRS) and the Internet, however this is not meant to be limiting in any way.
- Network 70 generally comprises a communication network arranged to enable electronic communication between server 20 and each of mobile computing platform 80 , computer 120 and mobile station 150 , via the respective communication module 50 .
- the action of each user input device 100 is preferably communicated to server 20 , and particularly to processor 30 of server 20 via the respective communication module 50 .
- mobile station 150 comprises a cellular telephone, a personal digital assistant, or a hand held computer, each of which meets the definition of a computing device.
- Server 20 is illustrated as a stand alone server, however this is not meant to be limiting in any way.
- server 20 is implemented on a cloud computing platform, in which a dynamically scalable and often virtualized resource is provided as a service over the Internet.
- operation of the method of server 20 is embeddable in a web site.
- Mobile station 150 is an example of a limited viewing device, since mobile station 150 is incapable of rendering a pseudo-realistic view of a complex 3D scene.
- processor 30 of server 20 is operative responsive to computer readable instructions stored on memory 40 to load a 3D scene comprising virtual model data from 3D scene storage 60 .
- the virtual model data is preferably supported in a plurality of formats.
- the 3D scene comprises Building Information Model (BIM) data.
- Processor 30 of server 20 is further operative to render a pseudo-realistic image of the loaded 3D scene, responsive to a first view positional indicator.
- the first view positional indicator is a default positional indicator.
- the rendered pseudo-realistic image exhibits an adjustable field of view, and the rendered pseudo-realistic image presents a view frustum responsive to the first view positional indicator.
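The patent does not state how the frustum is computed; a conventional perspective projection, sketched below, is one standard way to derive a view frustum from the view positional indicator's orientation and an adjustable field of view. The function name and the OpenGL-style matrix conventions are assumptions.

```python
import math

def perspective_matrix(fov_y_deg, aspect, near, far):
    """OpenGL-style projection matrix defining the view frustum; the
    adjustable field of view maps directly onto fov_y_deg."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), (2 * far * near) / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

# Widening the field of view widens the frustum seen from the same position:
narrow = perspective_matrix(fov_y_deg=30.0, aspect=16 / 9, near=0.1, far=5000.0)
wide = perspective_matrix(fov_y_deg=90.0, aspect=16 / 9, near=0.1, far=5000.0)
```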
- the rendering of the pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation.
- the rendered pseudo-realistic image is preferably stored in memory 40 , is preferably further associated with a session ID, and is transmitted to the source of the received command via the respective communication modules 50 and network 70 .
- for ease of understanding, the first one of mobile computing platform 80 , computer 120 and mobile station 150 which transmitted the command is hereinafter termed the initiating client.
- the 3D scene is rendered in accordance with the teachings of U.S. patent application Ser. No. 11/538,103 to Elsberg et al, entitled “Method and Apparatus for Virtual Reality Presentation of Civil Engineering, Land Planning and Infrastructure”, published as US 2007/0078636 A1, incorporated above by reference.
- the 3D scene is developed via photogrammetry, from existing architectural plans and land survey information, via light detection and ranging (LIDAR) and/or from existing or developed geographic information system (GIS) data.
- the initiating client is further provided with the session ID.
- the initiating client shares the session ID with another one or more of mobile computing platform 80 , computer 120 and mobile station 150 , such as by e-mail, SMS or any other form of communication.
- any of mobile computing platform 80 , computer 120 and mobile station 150 may link to server 20 .
- each of mobile computing platform 80 , computer 120 and mobile station 150 is provided with the opportunity to download and install software which will enable on-board generation of the pseudo-realistic images, or alternatively to avoid installation of software. It is known to those skilled in the art that installation of software often requires privileges which may not be easily obtained by the average corporate user, and thus the ability to avoid installation of software while maintaining a collaborative viewing experience is particularly advantageous.
- Server 20 , responsive to the provided session ID received from any of mobile computing platform 80 , computer 120 and mobile station 150 , is operative to transmit the rendered pseudo-realistic image, as indicated above preferably stored in memory 40 , associated with the provided session ID, to the source of the provided session ID.
- a user of any of mobile computing platform 80 , computer 120 and mobile station 150 now interacts via the respective user input device 100 to request a scene control, such as one or more of turning off at least one object of said 3D scene, changing the transparency of at least one object of said 3D scene, changing illumination of at least a portion of said 3D scene and changing a material type for at least one object of said 3D scene.
- turning off at least one element comprises adjusting the transparency of the at least one element to 100%. Adjusting the transparency, or turning off, of at least one element provides the user with extraordinary visual perception. Alternatively, sight conditions of the displayed 3D scene may be adjusted so as to provide a simulation of reduced visibility conditions such as fog.
- responsive to the received request for a second view positional indicator or other scene control, processor 30 of server 20 is operative to render a second pseudo-realistic image of the loaded 3D scene.
- the second rendered pseudo-realistic image is preferably stored in a cache portion of memory 40 , preferably further associated with the session ID, and is transmitted via the respective communication modules 50 and network 70 to each of mobile computing platform 80 , computer 120 and mobile station 150 , connected via the provided session ID for display on the respective display 90 .
- the second rendered pseudo-realistic image is immediately transmitted only to the initiating client, and one or more additional clients are maintained in a loop, querying at each iteration whether there is an updated image in the memory 40 associated with the present session ID.
- processor 30 of server 20 is arranged to transmit the updated image via the respective communication module 50 and network 70 .
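A minimal sketch of such a client-side polling loop follows. The fetch_image callable, the version tag, and the 0.5 s interval are assumptions; the patent specifies only that additional clients repeatedly ask whether an updated image is present for the session ID.

```python
import time

def poll_for_updates(fetch_image, session_id, display=print, interval_s=0.5):
    """Client-side loop: at each iteration ask the server whether memory 40
    holds a newer image for this session ID. fetch_image(session_id) is an
    assumed callable returning (version, image_bytes), e.g. an HTTP GET
    under the hood; the patent names no such interface."""
    last_version = None
    while True:
        version, image = fetch_image(session_id)
        if version != last_version:       # an updated image is available
            display(image)                # stand-in for drawing on display 90
            last_version = version
        time.sleep(interval_s)            # polling interval is an assumption
```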
- in the event that any of mobile computing platform 80 , computer 120 and mobile station 150 has downloaded and installed software which enables on-board generation of the pseudo-realistic images, as described above, available from, inter alia, RDV Systems of Lod, Israel, the 3D scene loaded by processor 30 of server 20 is further provided to the one of mobile computing platform 80 , computer 120 and mobile station 150 that downloaded and installed the software; thus the pseudo-realistic view is locally generated by the local processor 30 for display on the respective display 90 .
- the above mentioned received request for a second view positional indicator or other scene control is transmitted by processor 30 of server 20 to the one or more of mobile computing platform 80 , computer 120 and mobile station 150 which has downloaded and installed the image generation software.
- receipt of the actual rendered pseudo-realistic image is not required by the one or more of mobile computing platform 80 , computer 120 and mobile station 150 which has downloaded and installed the image generation software, since the pseudo-realistic image is generated locally by the respective processor 30 for display on the respective display 90 .
- in an embodiment employing real time position determining device 110 , changes in the physical position of real time position determining device 110 are transmitted to server 20 via the respective communication module 50 and network 70 as a scene control command.
- only the actual position of real time positioning determining device 110 is highlighted, or otherwise indicated on the rendered pseudo-realistic image transmitted by server 20 and received by any of mobile computing platform 80 , computer 120 and mobile station 150 connected by the provided session ID.
- the pseudo-realistic image is rendered further responsive to chronographic information associated with real time positioning determining device 110 .
- the rendered pseudo-realistic image exhibits shadowing responsive to a calculated position of the sun, corrected for the latitude, longitude, elevation and local time received from real time position determining device 110 .
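A rough illustration of the kind of solar-position computation such shadowing implies is sketched below, using a simplified declination and hour-angle model. A production renderer would use a full ephemeris; elevation correction is omitted here, and the function name and accuracy target are assumptions.

```python
import math
from datetime import datetime, timezone

def sun_altitude_deg(lat_deg, lon_deg, when_utc):
    """Approximate solar altitude, adequate for a plausible shadow direction."""
    day = when_utc.timetuple().tm_yday
    # Simplified solar declination for day-of-year:
    decl = 23.45 * math.sin(math.radians(360.0 * (284 + day) / 365.0))
    # Local solar time from longitude (15 degrees per hour):
    solar_time_h = when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0
    hour_angle = math.radians(15.0 * (solar_time_h - 12.0))
    lat, d = math.radians(lat_deg), math.radians(decl)
    alt = math.asin(math.sin(lat) * math.sin(d) +
                    math.cos(lat) * math.cos(d) * math.cos(hour_angle))
    return math.degrees(alt)

# Around solar noon near Lod, Israel (~31.95 N, 34.90 E) at the June solstice,
# the sun is nearly overhead (~80 degrees altitude):
print(sun_altitude_deg(31.95, 34.90, datetime(2009, 6, 21, 10, 0, tzinfo=timezone.utc)))
```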
- the 3D scene comprises at least one dynamic object, whose motion may optionally be set to be fixed.
- a vehicle having a predetermined speed of travel may be displayed in the pseudo-realistic scene.
- when the user's actual travel in the real world, as indicated by real time position determining device 110 , matches the predetermined speed, the user will be seen to be maintaining pace in relation to the dynamic object vehicle.
- any one or more of mobile computing platform 80 , computer 120 and mobile station 150 connected by the provided session ID may indicate a point of interest via the respective user input device 100 .
- the coordinates of the indicated point of interest are transmitted to server 20 via the respective communication module 50 and network 70 , and the rendered pseudo-realistic image is updated with a highlight, or other indicator.
- the highlight or other indicator is unique to the session ID/particular one of mobile computing platform 80 , computer 120 and mobile station 150 connected by the provided session ID, such as by providing a particular color for each of mobile computing platform 80 , computer 120 and mobile station 150 connected by the provided session ID.
- referring to FIG. 2 , the above is further illustrated: a plurality of highlights, or indicators, 200 , each associated with a particular one of mobile computing platform 80 , computer 120 and mobile station 150 , is depicted.
- Each of the indicators is different, with a first one of indicators 200 being marked by a vertical/horizontal cross hatch, a second one of indicators 200 being marked by a diagonal cross hatch and a third one of indicators 200 being marked by a diagonal pattern.
- the coordinates of the indicated point of interest are transmitted to server 20 along with information regarding the width and height of the respective display 90 of the one of mobile computing platform 80 , computer 120 and mobile station 150 connected by the provided session ID indicating the point of interest via the respective user input device 100 .
- the coordinates are associated with the highlight, or other indicator, and transmitted to each of mobile computing platform 80 , computer 120 and mobile station 150 connected by the provided session ID, which are operative to interpolate the position of the highlight, or other indicator based on the relative dimensions of the respective display 90 associated with the one of mobile computing platform 80 , computer 120 and mobile station 150 indicating the point of interest and the dimensions of the display 90 of the receiving one of mobile computing platform 80 , computer 120 and mobile station 150 .
- the highlight or other indicator is displayed at the appropriate location of the rendered pseudo-realistic image at the receiving one of mobile computing platform 80 , computer 120 and mobile station 150 connected by the provided session ID.
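One plausible reading of this interpolation is proportional scaling through display-independent coordinates, sketched below. The patent says only that the relative dimensions of the two displays are used, so the normalization scheme and function names are assumptions.

```python
def normalize_poi(px, py, src_w, src_h):
    """Sender side: express the point of interest as display-independent
    fractions of the originating display's width and height."""
    return px / src_w, py / src_h

def denormalize_poi(fx, fy, dst_w, dst_h):
    """Receiver side: interpolate back to pixels on a display of different
    dimensions, so the highlight lands on the same image feature."""
    return round(fx * dst_w), round(fy * dst_h)

# A tap at (160, 120) on a 320x240 phone screen maps to (960, 540) on a
# 1920x1080 workstation display:
fx, fy = normalize_poi(160, 120, 320, 240)
print(denormalize_poi(fx, fy, 1920, 1080))   # -> (960, 540)
```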
- the transmitted rendered pseudo-realistic image is transmitted with an omni-directional view.
- server 20 generates an image that can be applied to a spherical mapping, which represents the view of the rendered pseudo-realistic image in all directions as seen from the current view position indicator.
- the transmitted images are mapped by a respective processor 30 of any of mobile computing platform 80 , computer 120 and mobile station 150 connected by the provided session ID on a sphere surrounding the current view position.
- any of mobile computing platform 80 , computer 120 and mobile station 150 connected by the provided session ID are able to rotate the direction of the view and look at any portion without requiring further downloaded information.
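Assuming an equirectangular panorama as the concrete form of the spherical mapping (the patent says only that the image "can be applied to a spherical mapping"), rotating the view reduces to a local pixel lookup with no further server round trip:

```python
def equirect_pixel(yaw_deg, pitch_deg, img_w, img_h):
    """Map a view direction to a pixel of an equirectangular panorama:
    yaw spans the full 360 degrees of width, pitch spans 180 degrees of
    height (pitch +90 = zenith -> top row). The equirectangular layout
    is our assumption, not the patent's."""
    u = ((yaw_deg % 360.0) / 360.0) * (img_w - 1)
    v = ((90.0 - pitch_deg) / 180.0) * (img_h - 1)
    return int(u), int(v)

# Looking backwards at the horizon samples the center of a 4096x2048 panorama:
print(equirect_pixel(yaw_deg=180.0, pitch_deg=0.0, img_w=4096, img_h=2048))
```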
- processor 30 of server 20 is further operative to perform an analysis of at least one criterion of the visual model data loaded from 3D scene storage 60 , responsive to each view positional indicator.
- Processor 30 of server 20 is further operative to transmit at least one result of the performed analysis in concert with the transmitted rendered pseudo-realistic images.
- the criterion is sight distance, as described in U.S. Patent Application Publication US 2008/0021680, published Jan. 24, 2008, to Elsberg et al., entitled “Method and Apparatus for Evaluating Sight Distance”, the entire contents of which is incorporated herein by reference.
- the above system 10 thus enables active collaboration in real time between disparate users, which may be geographically widely separated.
- voice communication, or chat communication is further enabled between the actively collaborating users, to enable real time communication and collaboration.
- the one of mobile computing platform 80 , computer 120 and mobile station 150 initiating the scene command is further provided with an indication of the success of transmission to all other users.
- FIG. 3 illustrates a high level flow chart of the method of operation of server 20 of FIG. 1 to perform a method of visualization.
- a 3D scene comprising virtual model data is loaded.
- the 3D scene is stored in one of a large plurality of formats.
- the 3D scene comprises Building Information Model (BIM) data.
- a first pseudo-realistic image of the loaded 3D scene is rendered, responsive to a first view positional indicator.
- the first view positional indicator is a default positional indicator.
- the rendered pseudo-realistic image exhibits an adjustable field of view, and the rendered pseudo-realistic image presents a view frustum responsive to the first view positional indicator.
- the rendering of the pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation, as described further below in relation to stage 1070 .
- the 3D scene is rendered in accordance with the teachings of U.S. patent application Ser. No. 11/538,103 to Elsberg et al, entitled “Method and Apparatus for Virtual Reality Presentation of Civil Engineering, Land Planning and Infrastructure”, published as US 2007/0078636 A1, incorporated above by reference.
- the 3D scene is developed via photogrammetry, from existing architectural plans and land survey information, via light detection and ranging (LIDAR) and/or from existing or developed geographic information system (GIS) data.
- in stage 1020 , the rendered first pseudo-realistic image of stage 1010 is transmitted to at least two remote computing platforms.
- coordination between the various remote computing platforms is accomplished responsive to a session ID.
- the loaded 3D scene of stage 1000 is further transmitted.
- a scene control command, generated at any of the at least two remote computing platforms, is received.
- the scene control command is in one embodiment a second view positional indicator, different from the first view positional indicator of stage 1010 .
- the scene control command comprises one or more of turning off at least one object of said 3D scene, changing the transparency of at least one object of said 3D scene, changing illumination of at least a portion of said 3D scene and changing a material type for at least one object of said 3D scene.
- turning off at least one element comprises adjusting the transparency of the at least one element to 100%. Adjusting the transparency, or turning off, of at least one element provides the user with extraordinary visual perception. Alternatively, sight conditions of the displayed 3D scene may be adjusted so as to provide a simulation of reduced visibility conditions such as fog.
- the scene control command comprises a highlight indicator.
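The scene control commands enumerated above can be pictured as a simple dispatch. The dict-based command encoding below is an assumption for illustration, not the patent's wire format.

```python
def apply_scene_control(scene, command):
    """Dispatch sketch for the scene control commands of stage 1030.
    `scene` is a plain dict holding 'view', 'objects', 'illumination'
    and 'highlights'; the {'op': ...} encoding is assumed."""
    op = command["op"]
    if op == "set_view":                      # second view positional indicator
        scene["view"] = command["view"]
    elif op == "turn_off_object":             # i.e. transparency driven to 100%
        scene["objects"][command["name"]]["transparency"] = 1.0
    elif op == "set_transparency":
        scene["objects"][command["name"]]["transparency"] = command["value"]
    elif op == "set_illumination":
        scene["illumination"] = command["value"]
    elif op == "set_material":
        scene["objects"][command["name"]]["material"] = command["value"]
    elif op == "highlight":                   # highlight indicator, unique per client
        scene["highlights"].append((command["client_id"], command["point"]))
    else:
        raise ValueError(f"unknown scene control command: {op}")
```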
- in stage 1040 , responsive to the received scene control command of stage 1030 , a second pseudo-realistic image is rendered, in a manner in all respects similar to that described above in relation to stage 1010 .
- in stage 1050 , the rendered second pseudo-realistic image of stage 1040 is transmitted to the at least two remote computing platforms.
- coordination between the various remote computing platforms is accomplished responsive to a session ID.
- the received scene control command is further transmitted, so that computing platforms which have downloaded the image rendering software can locally render the second pseudo-realistic image.
- an analysis of at least one criterion of the loaded visual model data comprising the 3D scene of stage 1000 is performed, responsive to each view positional indicator, or scene control command.
- at least one result of the performed analysis is transmitted in concert with the transmitted rendered pseudo-realistic images.
- the rendering of the pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation.
- at least one of the remote computing platforms is associated with a real time positioning device and/or chronographic information, and the rendering is responsive to information from at least one of the real time positioning device and chronographic information.
- the chronographic information is developed in a separate device from the computing platform associated with the real time positioning device.
- the rendered pseudo-realistic image exhibits shadowing responsive to a calculated position of the sun, corrected for the latitude, longitude, elevation and local time received from the real time positioning device.
- At least one of the transmitted rendered images of stage 1020 and 1050 is transmitted with an omni-directional view.
- an image is generated that can be applied to a spherical mapping, which represents the view of the rendered pseudo-realistic image in all directions as seen from the current view position indicator.
- the above method is preferably further embeddable in a web site.
- FIG. 4 illustrates a high level flow chart of a method of operation of server 20 of FIG. 1 to perform a method of visualization comprising an omni-directional view on demand.
- a 3D scene comprising virtual model data is loaded.
- the 3D scene is stored in one of a large plurality of formats.
- the 3D scene comprises Building Information Model (BIM) data.
- a first pseudo-realistic image of the loaded 3D scene is rendered, responsive to a first view positional indicator.
- the first view positional indicator is a default positional indicator.
- the rendered pseudo-realistic image exhibits an adjustable field of view, and the rendered pseudo-realistic image presents a view frustum responsive to the first view positional indicator.
- the rendering of the pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation, as described further above in relation to stage 1070 of FIG. 3 .
- the 3D scene is rendered in accordance with the teachings of U.S. patent application Ser. No. 11/538,103 to Elsberg et al, entitled “Method and Apparatus for Virtual Reality Presentation of Civil Engineering, Land Planning and Infrastructure”, published as US 2007/0078636 A1, incorporated above by reference.
- the 3D scene is developed via photogrammetry, from existing architectural plans and land survey information, via light detection and ranging (LIDAR) and/or from existing or developed geographic information system (GIS) data.
- the rendered first pseudo-realistic image of stage 2010 is transmitted to a remote computing platform.
- the remote computing platform comprises one of a mobile computer, a fixed workstation, a cellular telephone, a personal digital assistant and a hand held computer.
- a scene control command generated by the remote computing platform is received.
- the scene control command is in one embodiment a second view positional indicator, different from the first view positional indicator of stage 2010 .
- the scene control command comprises one or more of turning off at least one object of said 3D scene, changing the transparency of at least one object of said 3D scene, changing illumination of at least a portion of said 3D scene and changing a material type for at least one object of said 3D scene.
- turning off at least one element comprises adjusting the transparency of the at least one element to 100%. Adjusting the transparency, or turning off, of at least one element provides the user with extraordinary visual perception. Alternatively, sight conditions of the displayed 3D scene may be adjusted so as to provide a simulation of reduced visibility conditions such as fog. In yet another embodiment, the scene control command comprises a highlight indicator.
- in stage 2040 , responsive to the received scene control command of stage 2030 , a second pseudo-realistic image is rendered, in a manner in all respects similar to that described above in relation to stage 2010 .
- in stage 2050 , the rendered second pseudo-realistic image of stage 2040 is transmitted to the remote computing platform.
- an animation request is received from the remote computing platform.
- the animation request is associated with a predefined animated camera path associated with the rendered 3D scene of stage 2000 .
- the animation is generated and streamed to the remote computing platform.
- the animation is pre-generated and indexed with positional information in relation to timestamps in the animation.
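Such a timestamp-to-position index might look like the following sketch, illustrating how a pause timestamp (stage 2070, below) recovers a view positional indicator on the camera path. The keyframe structure and names are assumptions.

```python
import bisect

class AnimationIndex:
    """Pre-generated animation indexed by timestamp: each entry relates a
    time in the streamed animation to a view positional indicator."""
    def __init__(self, keyframes):
        # keyframes: time-sorted list of (timestamp_s, view_positional_indicator)
        self.times = [t for t, _ in keyframes]
        self.views = [v for _, v in keyframes]

    def view_at(self, paused_at_s):
        """On a stop/pause request, return the most recent keyframe's view
        positional indicator for the omni-directional render of stage 2080."""
        i = max(bisect.bisect_right(self.times, paused_at_s) - 1, 0)
        return self.views[i]

index = AnimationIndex([(0.0, "station 0+00"), (5.0, "station 1+00"), (10.0, "station 2+00")])
print(index.view_at(7.3))   # -> "station 1+00"
```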
- in stage 2070 , a request for an omni-directional view, associated with a particular view positional indicator, is received from the remote computing platform.
- the view positional indicator is determined responsive to a stop, or pause, request received at a position in the streamed animation of stage 2060 .
- an omni-directional view is generated at the particular view positional indicator of stage 2070 , by rendering a plurality of views representing the 3D scene in all directions at the particular view positional indicator of stage 2070 .
- the rendered plurality of views representing the generated omni-directional view at the particular view positional indicator of stage 2070 is transmitted to the remote computing platform.
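Rendering "a plurality of views representing the 3D scene in all directions" is commonly done with the six faces of a cube map; the patent does not commit to a cube map, so the sketch below is one assumed realization.

```python
# The six axis-aligned view directions of a cube map; rendering the scene
# once per face at the paused view position covers all directions. Each
# face would use a 90 degree field of view so the faces tile the full
# sphere without gaps.
CUBE_FACES = {
    "+x": (1, 0, 0), "-x": (-1, 0, 0),
    "+y": (0, 1, 0), "-y": (0, -1, 0),
    "+z": (0, 0, 1), "-z": (0, 0, -1),
}

def render_omniview(render_one, position):
    """render_one(position, direction) -> image is an assumed renderer hook."""
    return {face: render_one(position, d) for face, d in CUBE_FACES.items()}

# Usage with a stand-in renderer:
faces = render_omniview(lambda p, d: f"img{d}@{p}", position=(12.0, 4.0, 1.7))
print(sorted(faces))   # ['+x', '+y', '+z', '-x', '-y', '-z']
```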
- certain of the present embodiments enable an electronic device to perform a method of visualization, the method comprising: loading a 3 dimensional (3D) scene comprising visual model data; rendering a first pseudo-realistic image of the loaded 3D scene responsive to a first view positional indicator; transmitting the rendered first pseudo-realistic image to at least two remote computing platforms; receiving from any of the at least two remote computing platforms a scene control command; rendering a second pseudo-realistic image of the loaded 3D scene responsive to the received scene control command; and transmitting the rendered second pseudo-realistic image to the at least two remote computing platforms.
- the server is thereby arranged to provide a collaborative interactive viewing experience between disparate devices.
- one of the computing platforms is a cellular telephone, and another of the computing platforms is a portable computer. Either of the devices can provide input to the shared collaborative viewing experience.
- a third computing platform is provided, the third computing platform arranged to generate the 3D scene internally responsive to positional indicator information.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- Instructional Devices (AREA)
Abstract
A computer-readable medium containing instructions for controlling an electronic device to perform a method of visualization, the method constituted of: loading a 3 dimensional (3D) scene comprising visual model data; rendering a first pseudo-realistic image of the loaded 3D scene responsive to a first view positional indicator; transmitting the rendered first pseudo-realistic image to at least two remote computing platforms; receiving from any of the at least two remote computing platforms a scene control command; rendering a second pseudo-realistic image of the loaded 3D scene responsive to the received scene control command; and transmitting the rendered second pseudo-realistic image to the at least two remote computing platforms.
Description
- This application claims priority from U.S. Provisional Patent Application 61/100,734 filed Sep. 28, 2008, entitled “Pseudo-Realistic Rendering of BIM Data Responsive to Positional Indicator”, the entire contents of which is incorporated herein by reference.
- The invention relates generally to the field of visual modeling, and in particular to a method and apparatus providing collaborative pseudo-realistic rendering of a visual model responsive to user inputs.
- Building information modeling is the process of generating and managing building data during its life cycle. Typically it uses three-dimensional, dynamic building modeling software to increase productivity in building design and construction. The term building design and construction is not limited to physical dwellings and/or offices, but is meant to additionally include any construction project including, without limitation, road and infrastructure projects. The process produces a Building Information Model (BIM), which as used herein comprises building geometry, spatial relationships, geographic information, and quantities and properties of building components, irrespective of whether we are dealing with a physical building or a general construction project including land development and infrastructure.
- The use of interactive and dynamic 3D computer graphics is becoming prevalent in the computing world. Typically, 3D visualization applications provide photo-realistic results using techniques such as ray tracing, radiosity, global illumination and other shading, shadowing and light reflection techniques. Such 3D visualization applications provide a 3D generated model, without relationship to the existing environment.
- U.S. patent application Ser. No. 11/538,103 to Elsberg et al, entitled “Method and Apparatus for Virtual Reality Presentation of Civil Engineering, Land Planning and Infrastructure”, published as US 2007/0078636 A1, the entire contents of which is incorporated herein by reference, is addressed to a computer implemented method of visualizing an infrastructure, in which the rendering is accomplished in cooperation with a material definition. Such a method allows for evaluating large scale designs in a virtual reality environment, in which the virtual reality rendering exhibits a pseudo-realistic image, defined herein as an image which comprises at least one of shading, texturing, illumination and shadowing based on real world parameters.
- Rapid Design Visualization is a software application available from RDV Systems, Ltd. of Lod, ISRAEL, which enables any Civil 3D user to create a fully interactive visualization environment directly from their Civil 3D project. Civil 3D is a software BIM solution for the field of civil engineering available from Autodesk, Inc. of San Rafael, Calif. The Rapid Design Visualization software enables a Civil 3D designer to easily create drive through simulations, flyovers and interactive simulations for proposed roads, subdivisions, underground infrastructure, interchanges and many other complex land development projects. Such an interactive simulation enables a potential user, developer, or investor, to visualize a Civil 3D project in an office environment.
- The above discussion has been primarily focused on BIM applications; however, this is not meant to be limiting in any way, and the invention is equally applicable to any visual model data.
- Freewheel software available from Autodesk, Inc. of San Rafael, Calif., provides the ability to upload design data to a remote server. Any user allowed access can then utilize a variety of navigation tools to review the design. Advantageously, no software download is required, since the requirement to download and install software is an often unbridgeable barrier in today's IT environment. Disadvantageously, collaborative tools are not provided.
- What is desired, and not provided by the prior art is a method and apparatus providing full active collaboration for any visual model, preferably without requiring a software download and installation.
- Accordingly, it is a principal object of the present invention to overcome at least some of the disadvantages of prior art collaborative visualization techniques. This is accomplished in certain embodiments by providing a server comprising a computer-readable medium containing instructions for controlling an electronic device to perform a method of visualization, the method comprising: loading a 3 dimensional (3D) scene comprising visual model data; rendering a first pseudo-realistic image of the loaded 3D scene responsive to a first view positional indicator; transmitting the rendered first pseudo-realistic image to at least two remote computing platforms; receiving from any of the at least two remote computing platforms a scene control command; rendering a second pseudo-realistic image of the loaded 3D scene responsive to the received scene control command; and transmitting the rendered second pseudo-realistic image to the at least two remote computing platforms.
- Thus, the server is arranged to provide a full active collaborative interactive viewing experience between disparate devices. The term active collaboration is meant to include wherein various users may simultaneously interact with the 3D model, with interaction displayed to all of the various users. In an exemplary embodiment, one of the computing platforms is a cellular telephone, and another of the computing platforms is a portable computer. Either of the devices can provide input to the shared collaborative viewing experience. In one particular embodiment, a third computing platform is provided, the third computing platform arranged to generate the 3D scene internally responsive to positional indicator information.
- In certain embodiments a computer-readable medium containing instructions for controlling an electronic device to perform a method of visualization is provided, the method comprising: loading a 3 dimensional (3D) scene comprising visual model data; rendering a first pseudo-realistic image of the loaded 3D scene responsive to a first view positional indicator; transmitting the rendered first pseudo-realistic image to at least two remote computing platforms; receiving from any of the at least two remote computing platforms a scene control command; rendering a second pseudo-realistic image of the loaded 3D scene responsive to the received scene control command; and transmitting the rendered second pseudo-realistic image to the at least two remote computing platforms.
- In certain further embodiments the scene control command comprises a second view positional indicator different from the first view positional indicator. In certain yet further embodiments, the method further comprises: performing an analysis of at least one criterion of the visual model data responsive to each of the first and second positional indicators; and transmitting at least one result of the performed analysis in concert with the respective transmitted rendered first and second pseudo-realistic images.
- In certain embodiments the scene control command comprises one of turning off at least one object of the 3D scene, changing illumination of at least a portion of the 3D scene and changing a material type for at least one object of the 3D scene. In yet other certain embodiments the scene control command comprises a highlight indicator.
- In certain embodiments the first pseudo-realistic image exhibits an adjustable field of view, and wherein the rendered first pseudo-realistic image presents a view frustum responsive to the first view positional indicator and the adjustable field of view. In other certain embodiments the rendering of the first and second pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation.
- In certain embodiments at least one of the transmitted rendered first pseudo-realistic image and the transmitted rendered second pseudo-realistic image comprises an omni-directional view. In other certain embodiments the visual model data is provided in a selectable one of a plurality of formats. In yet other certain embodiments the computer readable medium is embeddable into a web site.
- Independently a server comprising a computing device and a communication module is enabled, the computing device arranged to: load a 3 dimensional (3D) scene comprising visual model data; render a first pseudo-realistic image of the loaded 3D scene responsive to a first view positional indicator; transmit the rendered first pseudo-realistic image via the communication module to at least two remote computing platforms; receive, via the communication module, from any of the at least two remote computing platforms a scene control command; render a second pseudo-realistic image of the loaded 3D scene responsive to the received scene control command; and transmit the rendered second pseudo-realistic image to the at least two remote computing platforms.
- In certain embodiments the scene control command comprises a second view positional indicator different from the first view positional indicator. In certain further embodiments the computing device is further arranged to: perform an analysis of at least one criterion of the visual model data responsive to each of the first and second positional indicators; and transmit at least one result of the performed analysis to the at least two remote computing platforms in concert with the respective transmitted rendered first and second pseudo-realistic images.
- In certain embodiments the scene control command comprises one of: turning off at least one object of the 3D scene, changing illumination of at least a portion of the 3D scene and changing a material type for at least one object of the 3D scene. In other certain embodiments the scene control command comprises a highlight indicator.
- In certain embodiments the first pseudo-realistic image exhibits an adjustable field of view, and wherein the rendered first pseudo-realistic image presents a view frustum responsive to the first view positional indicator and the adjustable field of view. In other certain embodiments the rendering of the first and second pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation.
- In certain embodiments at least one of the transmitted rendered first pseudo-realistic image and the transmitted rendered second pseudo-realistic image comprises an omni-directional view. In other certain embodiments the visual model data is provided in a selectable one of a plurality of formats.
- Independently a computer-readable medium containing instructions for controlling an electronic device to perform a method of visualization, is enabled, the method comprising: loading a 3 dimensional (3D) scene comprising visual model data; rendering a pseudo-realistic image of the loaded 3D scene responsive to a first view positional indicator, the pseudo-realistic image comprising an omni-directional view; and transmitting the rendered pseudo-realistic image to at least two remote computing platforms.
- Additional features and advantages of the invention will become apparent from the following drawings and description.
- For a better understanding of the invention and to show how the same may be carried into effect, reference will now be made, purely by way of example, to the accompanying drawings in which like numerals designate corresponding elements or sections throughout.
- With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice. In the accompanying drawings:
- FIG. 1 illustrates a high level block diagram of a system providing universal collaborative visualization in accordance with an exemplary embodiment;
- FIG. 2 illustrates a rendered pseudo-realistic image in which a plurality of highlights, or indicators, each associated with a particular device of FIG. 1 is depicted;
- FIG. 3 illustrates a high level flow chart of a method of operation of the server of FIG. 1 to perform a method of visualization; and
- FIG. 4 illustrates a high level flow chart of a method of operation of the server of FIG. 1 to perform a method of visualization comprising an omni-directional view on demand.
- Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
-
FIG. 1 illustrates a high level block diagram of asystem 10 providing universal collaborative visualization in accordance with an exemplary embodiment,system 10 comprising: aserver 20 comprising aprocessor 30, amemory 40, acommunication module 50 and a3 D scene storage 60; anetwork 70; amobile computing platform 80 comprising aprocessor 30, acommunication module 50, adisplay 90 and auser input device 100; a real timeposition determining device 110; acomputer 120 comprising aprocessor 30, acommunication module 50, adisplay 90 and auser input device 100; and amobile station 150 comprising aprocessor 30, acommunication module 50, adisplay 90 and auser input device 100. -
Processor 30 ofserver 20 is in communication withmemory 3D scene storage 60 andcommunication module 50 ofserver 20. Real timeposition determining device 110 is in communication withmobile computing platform 80, and in particular withprocessor 30 thereof.Processor 30 ofmobile computing platform 80 is in communication with each ofcommunication module 50,display 90 anduser input device 100 thereof.Processor 30 ofcomputer 120 is in communication with each ofcommunication module 50,display 90 anduser input device 100 thereof.Processor 30 ofmobile station 150 is in communication with each ofcommunication module 50,display 90 anduser input device 100 thereof. In one embodiment,network 70 is a combination of a General Packet Radio Service (GPRS) and the Internet, however this is not meant to be limiting in any way.Network 70 generally comprises a communication network arranged to enable electronic communication betweenserver 20 and each ofmobile computing platform 80,computer 120 andmobile station 150, via therespective communication module 50. The action of eachuser input device 100 is preferably communicated toserver 20, and particularly toprocessor 30 ofserver 20 via therespective communication module 50. In one non-limiting example,mobile station 150 comprises a cellular telephone, a personal digital assistant, or a hand held computer, each of which meets the definition of a computing device.Server 20 is illustrated as a stand alone server, however this is not meant to be limiting in any way. In anexemplary embodiment server 20 is implemented on a cloud computing platform, in which a dynamically scalable and often virtualized resource is provided as a service over the Internet. Preferably, operation of the method ofserver 20 is embeddable in a web site.Mobile station 150 is an example of a limiting viewing device, sincemobile station 150 is incapable of rendering a pseudo-realistic view of a complex 3D scene. - In operation, responsive to a command received from a first one of
mobile computing platform 80,computer 120 andmobile station 150,processor 30 ofserver 20 is operative responsive to computer readable instructions stored onmemory 40 to load a 3D scene comprising virtual model data from3D scene storage 60. The virtual model data is preferably supported in a plurality of formats. In one non-limiting embodiment, the 3D scene comprises Building Information Model (BIM) data.Processor 30 ofserver 20 is further operative to render a pseudo-realistic image of the loaded 3D scene, responsive to a first view positional indicator. In one particular embodiment, the first view positional indicator is a default positional indicator. Preferably, the rendered pseudo-realistic image exhibits an adjustable field of view, and the rendered pseudo-realistic image presents a view frustum responsive to the first view positional indicator. Preferably, the rendering of the pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation. The rendered pseudo-realistic image is preferably stored inmemory 40, is preferably further associated with a session ID, and is transmitted to the source of the received command via therespective communication modules 50 andnetwork 70. For ease of understanding, we will hereinafter term the first one ofmobile computing platform 80,computer 120 andmobile station 150 which transmitted the command the initiating client. - In one embodiment the 3D scene is rendered in accordance with the teachings of U.S. patent application Ser. No. 11/538,103 to Elsberg et al, entitled “Method and Apparatus for Virtual Reality Presentation of Civil Engineering, Land Planning and Infrastructure”, published as US 2007/0078636 A1, incorporated above by reference. In another embodiment the 3D scene is developed via photogrammetry, from existing architectural plans and land survey information, via light detecting and ranging (LIDAR) and/or from existing or developed geographic information system (GIS) data.
- The initiating client is further provided with the session ID. In one embodiment, the initiating client shares the session ID with another one or more of
mobile computing platform 80,computer 120 andmobile station 150, such as by e-mail, SMS or any other form of communication. - Any of
mobile computing platform 80,computer 120 andmobile station 150, in addition to the initiating client, responsive to the received session ID, may link toserver 20. In an exemplary embodiment, each ofmobile computing platform 80,computer 120 andmobile station 150 is provided with the opportunity to download and install software which will enable on-board generation of the pseudo-realistic images, or alternatively to avoid installation of software. It is known to those skilled in the art that installation of software often requires privileges which may not be easily obtained by the average corporate user, and thus the ability to avoid installation of software while maintaining a collaborative viewing experience is particularly advantageous. -
Server 20, responsive to the provided session ID received from any of mobile computing platform 80, computer 120 and mobile station 150, is operative to transmit the rendered pseudo-realistic image associated with the provided session ID, as indicated above preferably stored in memory 40, to the source of the provided session ID. Thus, both the initiating client and one or more additional clients are provided with the same rendered pseudo-realistic image.
- A user of any of
mobile computing platform 80, computer 120 and mobile station 150, connected via the provided session ID, now interacts via the respective user input device 100, thus requesting a second view positional indicator, different from the first view positional indicator. Alternatively, or additionally, a user of any of mobile computing platform 80, computer 120 and mobile station 150, connected via the provided session ID, now interacts via the respective user input device 100 to request a scene control, such as one or more of turning off at least one object of said 3D scene, changing the transparency of at least one object of said 3D scene, changing illumination of at least a portion of said 3D scene and changing a material type for at least one object of said 3D scene. In an exemplary embodiment, turning off at least one element comprises adjusting the transparency of the at least one element to 100%. Adjusting the transparency, or turning off, of at least one element provides the user with extraordinary visual perception. Alternatively, sight conditions of the displayed 3D scene may be adjusted so as to provide a simulation of reduced visibility conditions, such as fog.
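By way of non-limiting illustration, such a scene control request may be encoded as a simple structure; the field names below are assumptions of this sketch only:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SceneControl:
    """One possible encoding of the scene control requests described
    above; all field names are illustrative assumptions."""
    view: Optional[tuple] = None                       # second view positional indicator
    transparency: dict = field(default_factory=dict)   # object id -> 0.0..1.0
    illumination: Optional[float] = None               # scene illumination scale
    materials: dict = field(default_factory=dict)      # object id -> material name
    fog_visibility_m: Optional[float] = None           # reduced-visibility simulation

# Turning off an object is expressed as 100% transparency:
cmd = SceneControl(transparency={"object_17": 1.0}, fog_visibility_m=150.0)
```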
- Responsive to the received request for a second view positional indicator or other scene control, processor 30 of server 20 is operative to render a second pseudo-realistic image of the loaded 3D scene. The second rendered pseudo-realistic image is preferably stored in a cache portion of memory 40, preferably further associated with the session ID, and is transmitted via the respective communication modules 50 and network 70 to each of mobile computing platform 80, computer 120 and mobile station 150 connected via the provided session ID, for display on the respective display 90. In one embodiment, the second rendered pseudo-realistic image is immediately transmitted only to the initiating client, and the one or more additional clients are maintained in a loop, requesting at each iteration whether there is an updated image in memory 40 associated with the present session ID. In the event that there is an updated image in memory 40 associated with the present session ID, processor 30 of server 20 is arranged to transmit the updated image via the respective communication module 50 and network 70.
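The request loop of the one or more additional clients may be sketched as follows; fetch_image and show are assumed callables standing in for the client's communication module 50 and display 90:

```python
import time

def follow_session(session_id, fetch_image, show, poll_s=0.5):
    """Client-side request loop described above: at each iteration, ask
    the server whether an updated image is associated with the session ID
    in memory 40, and display it when one arrives. fetch_image is assumed
    to return (tag, image), with image None when nothing newer exists."""
    last_tag = None
    while True:
        tag, image = fetch_image(session_id, last_tag)
        if image is not None:
            show(image)          # respective display 90
            last_tag = tag
        time.sleep(poll_s)       # pace the polling iterations
```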
- In the event that any of mobile computing platform 80, computer 120 and mobile station 150 has downloaded and installed software enabling on-board generation of the pseudo-realistic images, as described above, such software being available from, inter alia, RDV Systems of Lod, Israel, the 3D scene loaded by processor 30 of server 20 is further provided to the one of mobile computing platform 80, computer 120 and mobile station 150 that downloaded and installed the software; thus, the pseudo-realistic view is locally generated by the local processor 30 for display on the respective display 90. In one embodiment, the above mentioned received request for a second view positional indicator or other scene control is transmitted by processor 30 of server 20 to the one or more of mobile computing platform 80, computer 120 and mobile station 150 which have downloaded and installed the image generation software. Thus, receipt of the actual rendered pseudo-realistic image is not required by the one or more of mobile computing platform 80, computer 120 and mobile station 150 which have downloaded and installed the image generation software, since the pseudo-realistic image is generated locally by the respective processor 30 for display on the respective display 90.
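A non-limiting sketch of this capability-dependent fan-out, under the assumption that each client object exposes a has_local_renderer flag and a send method:

```python
def distribute_update(clients, control_cmd, rendered_image):
    """Fan out a scene update as described above: clients that installed
    the image generation software receive only the scene control command
    and render locally, while thin clients receive the rendered image
    itself. The client object shape is an assumption of this sketch."""
    for client in clients:
        if client.has_local_renderer:
            client.send({"type": "scene_control", "command": control_cmd})
        else:
            client.send({"type": "image", "data": rendered_image})
```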
- In an embodiment wherein real time position determining device 110 is supplied, changes in the physical position of real time position determining device 110 are transmitted to server 20 via the respective communication module 50 and network 70 as a scene control command. Alternatively, only the actual position of real time position determining device 110 is highlighted, or otherwise indicated, on the rendered pseudo-realistic image transmitted by server 20 and received by any of mobile computing platform 80, computer 120 and mobile station 150 connected by the provided session ID.
- In one embodiment, the pseudo-realistic image is rendered further responsive to chronographic information associated with real time position determining
device 110. Thus, the rendered pseudo-realistic image exhibits shadowing responsive to a calculated position of the sun, correct for the latitude, longitude, elevation and local time received from real time position determining device 110.
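A first-order sketch of the underlying computation, using a standard declination and hour-angle approximation; the disclosure does not specify a solar model, so refinements such as the equation of time and atmospheric refraction are omitted here, and the solar hour is assumed to have already been derived from the local clock time and longitude:

```python
import math

def sun_direction(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation and azimuth (degrees from north,
    clockwise) for shadow casting as described above."""
    lat = math.radians(lat_deg)
    # Approximate solar declination for the day of the year.
    decl = math.radians(-23.44 * math.cos(2.0 * math.pi / 365.0 * (day_of_year + 10)))
    # Hour angle: 15 degrees per hour away from solar noon.
    h = math.radians(15.0 * (solar_hour - 12.0))
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(h))
    el = math.asin(sin_el)
    cos_el = math.cos(el)
    if cos_el < 1e-9:               # sun at the zenith: azimuth undefined
        return math.degrees(el), 0.0
    cos_az = (math.sin(decl) - sin_el * math.sin(lat)) / (cos_el * math.cos(lat))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if h > 0:                       # afternoon: sun west of the meridian
        az = 2.0 * math.pi - az
    return math.degrees(el), math.degrees(az)
```

The elevation fixes the shadow length and the azimuth fixes the shadow direction, so a single directional light configured from this pair suffices for the shadowing described.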
- In one embodiment, the 3D scene comprises at least one dynamic object, whose motion may optionally be set in advance. Thus, in a non-limiting example, a vehicle having a predetermined speed of travel may be displayed in the pseudo-realistic scene. In the event that the user's actual travel in the real world, as indicated by real time position determining device 110, matches the predetermined speed, the user will be seen to be maintaining pace in relation to the dynamic object vehicle.
- Preferably, any one or more of
mobile computing platform 80, computer 120 and mobile station 150 connected by the provided session ID may indicate a point of interest via the respective user input device 100. The coordinates of the indicated point of interest are transmitted to server 20 via the respective communication module 50 and network 70, and the rendered pseudo-realistic image is updated with a highlight, or other indicator. Preferably, the highlight or other indicator is unique to the session ID and to the particular one of mobile computing platform 80, computer 120 and mobile station 150 connected by the provided session ID, such as by providing a particular color for each of mobile computing platform 80, computer 120 and mobile station 150 connected by the provided session ID.
- Referring to
FIG. 2, the above is further illustrated, in which a plurality of highlights, or indicators, 200, each associated with a particular one of mobile computing platform 80, computer 120 and mobile station 150, are illustrated. Each of the indicators is different, with a first one of indicators 200 being marked by a vertical/horizontal cross hatch, a second one of indicators 200 being marked by a diagonal cross hatch and a third one of indicators 200 being marked by a diagonal pattern.
- In one non-limiting embodiment, the coordinates of the indicated point of interest are transmitted to
server 20 along with information regarding the width and height of the respective display 90 of the one of mobile computing platform 80, computer 120 and mobile station 150 connected by the provided session ID indicating the point of interest via the respective user input device 100. The coordinates are associated with the highlight, or other indicator, and transmitted to each of mobile computing platform 80, computer 120 and mobile station 150 connected by the provided session ID, which are operative to interpolate the position of the highlight, or other indicator, based on the relative dimensions of the display 90 associated with the one of mobile computing platform 80, computer 120 and mobile station 150 indicating the point of interest and the dimensions of the display 90 of the receiving one of mobile computing platform 80, computer 120 and mobile station 150. Thus, the highlight or other indicator is displayed at the appropriate location of the rendered pseudo-realistic image at each receiving one of mobile computing platform 80, computer 120 and mobile station 150 connected by the provided session ID.
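The interpolation may be sketched as a normalization against the indicating display's dimensions followed by rescaling to the receiving display's dimensions; the function name is an illustrative assumption:

```python
def map_highlight(x_px, y_px, src_size, dst_size):
    """Map a point of interest picked on one display to the matching
    location on another display showing the same rendered image."""
    src_w, src_h = src_size
    dst_w, dst_h = dst_size
    u, v = x_px / src_w, y_px / src_h   # resolution-independent coordinates
    return round(u * dst_w), round(v * dst_h)

# A point picked at (640, 360) on a 1280x720 computer display maps to
# (160, 120) on a 320x240 mobile station display:
print(map_highlight(640, 360, (1280, 720), (320, 240)))
```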
- In one non-limiting embodiment, the transmitted rendered pseudo-realistic image is transmitted with an omni-directional view. In particular, server 20 generates an image that can be applied to a spherical mapping, which represents the view of the rendered pseudo-realistic image in all directions as seen from the current view position indicator. The transmitted images are mapped by a respective processor 30 of any of mobile computing platform 80, computer 120 and mobile station 150 connected by the provided session ID on a sphere surrounding the current view position. Thus, any of mobile computing platform 80, computer 120 and mobile station 150 connected by the provided session ID are able to rotate the direction of the view and look at any portion thereof without requiring further downloaded information.
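Assuming, purely for illustration, that the spherical mapping is stored as an equirectangular panorama (the disclosure does not fix the image format), the local lookup of a view direction reduces to:

```python
def panorama_uv(yaw_deg, pitch_deg):
    """Map a view direction to normalized coordinates of an assumed
    equirectangular panorama: yaw spans the image width and pitch the
    image height, with +90 degrees (straight up) at the top row."""
    u = (yaw_deg % 360.0) / 360.0
    v = (90.0 - pitch_deg) / 180.0
    return u, v

print(panorama_uv(90.0, 30.0))   # looking due east, slightly upward
```

Rotating the view thus requires only resampling the already transmitted image, which is why no further download is needed.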
- In one non-limiting embodiment, processor 30 of server 20 is further operative to perform an analysis of at least one criterion of the visual model data loaded from 3D scene storage 60, responsive to each view positional indicator. Processor 30 of server 20 is further operative to transmit at least one result of the performed analysis in concert with the transmitted rendered pseudo-realistic images. In a non-limiting embodiment, the criterion is sight distance, as described in published U.S. Patent Application S/N US 2008/0021680, published Jan. 24, 2008 to Elsberg et al., entitled "Method and Apparatus for Evaluating Sight Distance", the entire contents of which are incorporated herein by reference.
- The
above system 10 thus enables active collaboration in real time between disparate users, who may be geographically widely separated. In one embodiment, voice communication, or chat communication, is further enabled between the actively collaborating users, to enable real time communication and collaboration. Preferably, the one of mobile computing platform 80, computer 120 and mobile station 150 initiating the scene command is further provided with an indication of the success of transmission to all other users.
-
FIG. 3 illustrates a high level flow chart of the method of operation of server 20 of FIG. 1 to perform a method of visualization. In stage 1000, a 3D scene comprising virtual model data is loaded. Preferably, the 3D scene is stored in one of a large plurality of formats. In one non-limiting embodiment, the 3D scene comprises Building Information Model (BIM) data.
- In stage 1010, a first pseudo-realistic image of the loaded 3D scene is rendered, responsive to a first view positional indicator. In one particular embodiment, the first view positional indicator is a default positional indicator. Preferably, the rendered pseudo-realistic image exhibits an adjustable field of view, and the rendered pseudo-realistic image presents a view frustum responsive to the first view positional indicator. Preferably, the rendering of the pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation, as described further below in relation to
stage 1070.
- In one embodiment the 3D scene is rendered in accordance with the teachings of U.S. patent application Ser. No. 11/538,103 to Elsberg et al., entitled "Method and Apparatus for Virtual Reality Presentation of Civil Engineering, Land Planning and Infrastructure", published as US 2007/0078636 A1, incorporated above by reference. In another embodiment the 3D scene is developed via photogrammetry, from existing architectural plans and land survey information, via light detection and ranging (LIDAR) and/or from existing or developed geographic information system (GIS) data.
- In
stage 1020, the rendered first pseudo-realistic image of stage 1010 is transmitted to at least two remote computing platforms. In one non-limiting embodiment, as described above, coordination between the various remote computing platforms is accomplished responsive to a session ID. Optionally, in the event that one or more of the computing platforms has downloaded image rendering software, the loaded 3D scene of stage 1000 is further transmitted.
- In
stage 1030, a scene control command, generated at any of the at least two remote computing platforms, is received. The scene control command is in one embodiment a second view positional indicator, different from the first view positional indicator of stage 1010. In another embodiment, the scene control command comprises one or more of turning off at least one object of said 3D scene, changing the transparency of at least one object of said 3D scene, changing illumination of at least a portion of said 3D scene and changing a material type for at least one object of said 3D scene. In an exemplary embodiment, turning off at least one element comprises adjusting the transparency of the at least one element to 100%. Adjusting the transparency, or turning off, of at least one element provides the user with extraordinary visual perception. Alternatively, sight conditions of the displayed 3D scene may be adjusted so as to provide a simulation of reduced visibility conditions, such as fog. In yet another embodiment, the scene control command comprises a highlight indicator.
- In
stage 1040, responsive to the received scene control command of stage 1030, a second pseudo-realistic image is rendered, in a manner in all respects similar to that described above in relation to stage 1010. In stage 1050, the rendered second pseudo-realistic image of stage 1040 is transmitted to the at least two remote computing platforms. In one non-limiting embodiment, as described above, coordination between the various remote computing platforms is accomplished responsive to a session ID. Optionally, in the event that one or more of the computing platforms has downloaded the image rendering software, the received scene control command is further transmitted, so that computing platforms which have downloaded the image rendering software can locally render the second pseudo-realistic image.
- In
optional stage 1060, an analysis of at least one criterion of the loaded visual model data comprising the 3D scene of stage 1000 is performed, responsive to each view positional indicator or scene control command. Preferably, at least one result of the performed analysis is transmitted in concert with the transmitted rendered pseudo-realistic images.
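A non-limiting sketch of stage 1060, in which render and the per-criterion analysis callables (for example, a sight distance evaluation) are assumed interfaces:

```python
def respond_with_analysis(render, scene, view, analyses):
    """Evaluate each registered criterion for the current view positional
    indicator and return its result in concert with the rendered image.
    render(scene, view) and each entry of analyses (criterion name ->
    callable) are assumed interfaces of this sketch."""
    image = render(scene, view)
    results = {name: evaluate(scene, view) for name, evaluate in analyses.items()}
    return {"image": image, "analysis": results}
```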
- In optional stage 1070, the rendering of the pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation. In one embodiment, at least one of the remote computing platforms is associated with a real time positioning device and/or chronographic information, and the rendering is responsive to information from at least one of the real time positioning device and the chronographic information. Alternatively, the chronographic information is developed in a device separate from the computing platform associated with the real time positioning device. The rendered pseudo-realistic image exhibits shadowing responsive to a calculated position of the sun, correct for the latitude, longitude, elevation and local time received from the real time positioning device.
- In
optional stage 1080, at least one of the transmitted rendered images of stages 1020 and 1050 is transmitted with an omni-directional view, as described above.
- In
optional stage 1090, the above method is preferably further embeddable in a web site. -
FIG. 4 illustrates a high level flow chart of a method of operation of server 20 of FIG. 1 to perform a method of visualization comprising an omni-directional view on demand.
- In stage 2000, a 3D scene comprising virtual model data is loaded. Preferably, the 3D scene is stored in one of a large plurality of formats. In one non-limiting embodiment, the 3D scene comprises Building Information Model (BIM) data.
- In stage 2010, a first pseudo-realistic image of the loaded 3D scene is rendered, responsive to a first view positional indicator. In one particular embodiment, the first view positional indicator is a default positional indicator. Preferably, the rendered pseudo-realistic image exhibits an adjustable field of view, and the rendered pseudo-realistic image presents a view frustum responsive to the first view positional indicator. Preferably, the rendering of the pseudo-realistic image comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation, as described further above in relation to stage 1070 of
FIG. 3.
- In one embodiment the 3D scene is rendered in accordance with the teachings of U.S. patent application Ser. No. 11/538,103 to Elsberg et al., entitled "Method and Apparatus for Virtual Reality Presentation of Civil Engineering, Land Planning and Infrastructure", published as US 2007/0078636 A1, incorporated above by reference. In another embodiment the 3D scene is developed via photogrammetry, from existing architectural plans and land survey information, via light detection and ranging (LIDAR) and/or from existing or developed geographic information system (GIS) data.
- In
stage 2020, the rendered first pseudo-realistic image of stage 2010 is transmitted to a remote computing platform. In one non-limiting embodiment, the remote computing platform comprises one of a mobile computer, a fixed workstation, a cellular telephone, a personal digital assistant and a hand held computer. In stage 2030, a scene control command generated by the remote computing platform is received. The scene control command is in one embodiment a second view positional indicator, different from the first view positional indicator of stage 2010. In another embodiment, the scene control command comprises one or more of turning off at least one object of said 3D scene, changing the transparency of at least one object of said 3D scene, changing illumination of at least a portion of said 3D scene and changing a material type for at least one object of said 3D scene. In an exemplary embodiment, turning off at least one element comprises adjusting the transparency of the at least one element to 100%. Adjusting the transparency, or turning off, of at least one element provides the user with extraordinary visual perception. Alternatively, sight conditions of the displayed 3D scene may be adjusted so as to provide a simulation of reduced visibility conditions, such as fog. In yet another embodiment, the scene control command comprises a highlight indicator.
- In
stage 2040, responsive to the received scene control command of stage 2030, a second pseudo-realistic image is rendered, in a manner in all respects similar to that described above in relation to stage 2010. In stage 2050, the rendered second pseudo-realistic image of stage 2040 is transmitted to the remote computing platform.
- In
optional stage 2060, an animation request is received from the remote computing platform. In one non-limiting embodiment, the animation request is associated with a predefined animated camera path associated with the rendered 3D scene of stage 2000. The animation is generated and streamed to the remote computing platform. Preferably, the animation is pre-generated and indexed with positional information in relation to timestamps in the animation.
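A minimal sketch of such an index, assuming the pre-generated animation is described by (timestamp, view positional indicator) keyframes:

```python
import bisect

class AnimationIndex:
    """Resolve a stop or pause timestamp in a streamed animation back to
    the view positional indicator along the predefined camera path; the
    keyframe representation is an assumption of this sketch."""
    def __init__(self, keyframes):
        keyframes = sorted(keyframes)           # (timestamp_s, view) pairs
        self.times = [t for t, _ in keyframes]
        self.views = [v for _, v in keyframes]

    def view_at(self, t):
        """Return the camera position in effect at time t."""
        i = max(0, bisect.bisect_right(self.times, t) - 1)
        return self.views[i]
```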
- In stage 2070, a request for an omni-directional view, associated with a particular view positional indicator, is received from the remote computing platform. In a non-limiting embodiment, in which optional stage 2060 is implemented, the view positional indicator is determined responsive to a stop, or pause, request received at a position in the streamed animation of stage 2060.
- In stage 2080, an omni-directional view is generated at the particular view positional indicator of stage 2070, by rendering a plurality of views representing the 3D scene in all directions at the particular view positional indicator of stage 2070. In stage 2090, the rendered plurality of views representing the generated omni-directional view at the particular view positional indicator of stage 2070 is transmitted to the remote computing platform.
- Thus, certain of the present embodiments enable an electronic device to perform a method of visualization, the method comprising: loading a 3 dimensional (3D) scene comprising visual model data; rendering a first pseudo-realistic image of the loaded 3D scene responsive to a first view positional indicator; transmitting the rendered first pseudo-realistic image to at least two remote computing platforms; receiving from any of the at least two remote computing platforms a scene control command; rendering a second pseudo-realistic image of the loaded 3D scene responsive to the received scene control command; and transmitting the rendered second pseudo-realistic image to the at least two remote computing platforms.
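One straightforward realization of stage 2080, offered as a non-limiting sketch, is to render six 90 degree view frusta arranged as cube faces; render_face is an assumed rendering helper:

```python
CUBE_FACES = [             # (yaw_deg, pitch_deg) of six 90-degree frusta
    (0, 0), (90, 0), (180, 0), (270, 0),   # the four horizontal directions
    (0, 90), (0, -90),                     # straight up and straight down
]

def render_omni_view(render_face, scene, position):
    """Render the plurality of views of stage 2080, together covering the
    3D scene in all directions at one view positional indicator.
    render_face(scene, position, yaw, pitch, fov) is assumed to return a
    single rendered pseudo-realistic image."""
    return [render_face(scene, position, yaw, pitch, fov=90.0)
            for yaw, pitch in CUBE_FACES]
```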
- The server is thereby arranged to provide a collaborative interactive viewing experience between disparate devices. In an exemplary embodiment, one of the computing platforms is a cellular telephone, and another of the computing platforms is a portable computer. Either of the devices can provide input to the shared collaborative viewing experience. In one particular embodiment, a third computing platform is provided, the third computing platform arranged to generate the 3D scene internally responsive to positional indicator information.
- It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination.
- Unless otherwise defined, all technical and scientific terms used herein have the same meanings as are commonly understood by one of ordinary skill in the art to which this invention belongs. Although methods similar or equivalent to those described herein can be used in the practice or testing of the present invention, suitable methods are described herein.
- All publications, patent applications, patents, and other references mentioned herein are incorporated by reference in their entirety. In case of conflict, the patent specification, including definitions, will prevail. In addition, the materials, methods, and examples are illustrative only and not intended to be limiting.
- It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather the scope of the present invention is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove as well as variations and modifications thereof, which would occur to persons skilled in the art upon reading the foregoing description.
Claims (20)
1. A computer-readable medium containing instructions for controlling an electronic device to perform a method of visualization, the method comprising:
loading a 3 dimensional (3D) scene comprising visual model data;
rendering a first pseudo-realistic image of said loaded 3D scene responsive to a first view positional indicator;
transmitting said rendered first pseudo-realistic image to at least two remote computing platforms;
receiving from any of said at least two remote computing platforms a scene control command;
rendering a second pseudo-realistic image of said loaded 3D scene responsive to said received scene control command; and
transmitting said rendered second pseudo-realistic image to said at least two remote computing platforms.
2. The computer-readable medium of claim 1 , wherein said scene control command comprises a second view positional indicator different from said first view positional indicator.
3. The computer-readable medium of claim 2 , wherein said method further comprises:
performing an analysis of at least one criterion of said visual model data responsive to each of said first and second positional indicators; and
transmitting at least one result of said performed analysis in concert with said respective transmitted rendered first and second pseudo-realistic images.
4. The computer-readable medium of claim 1 , wherein said scene control command comprises one of turning off at least one object of said 3D scene, changing illumination of at least a portion of said 3D scene and changing a material type for at least one object of said 3D scene.
5. The computer-readable medium of claim 1 , wherein said scene control command comprises a highlight indicator.
6. The computer-readable medium of claim 1 , wherein said first pseudo-realistic image exhibits an adjustable field of view, and wherein said rendered first pseudo-realistic image presents a view frustum responsive to said first view positional indicator and said adjustable field of view.
7. The computer-readable medium of claim 1 , wherein said rendering of said first and second pseudo-realistic images comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation.
8. The computer-readable medium of claim 1 , wherein at least one of said transmitted rendered first pseudo-realistic image and said transmitted rendered second pseudo-realistic image comprises an omni-directional view.
9. The computer-readable medium of claim 1 , wherein said visual model data is provided in a selectable one of a plurality of formats.
10. The computer-readable medium of claim 1 , wherein the computer readable medium is embeddable into a web site.
11. A server comprising a computing device and a communication module, said computing device arranged to:
load a 3 dimensional (3D) scene comprising visual model data;
render a first pseudo-realistic image of said loaded 3D scene responsive to a first view positional indicator;
transmit said rendered first pseudo-realistic image via said communication module to at least two remote computing platforms;
receive, via said communication module, from any of said at least two remote computing platforms a scene control command;
render a second pseudo-realistic image of said loaded 3D scene responsive to said received scene control command; and
transmit said rendered second pseudo-realistic image to said at least two remote computing platforms.
12. The server of claim 11 , wherein said scene control command comprises a second view positional indicator different from said first view positional indicator.
13. The server of claim 12 , wherein said computing device is further arranged to:
perform an analysis of at least one criterion of said visual model data responsive to each of said first and second positional indicators; and
transmit at least one result of said performed analysis to said at least two remote computing platforms in concert with said respective transmitted rendered first and second pseudo-realistic images.
14. The server of claim 11 , wherein said scene control command comprises one of turning off at least one object of said 3D scene, changing illumination of at least a portion of said 3D scene and changing a material type for at least one object of said 3D scene.
15. The server of claim 11 , wherein said scene control command comprises a highlight indicator.
16. The server of claim 11 , wherein said first pseudo-realistic image exhibits an adjustable field of view, and wherein said rendered first pseudo-realistic image presents a view frustum responsive to said first view positional indicator and said adjustable field of view.
17. The server of claim 11 , wherein said rendering of said first and second pseudo-realistic images comprises at least two of shading, texturing, illumination and shadowing responsive to real time orientation information in respect to latitude, longitude and elevation.
18. The server of claim 11 , wherein at least one of said transmitted rendered first pseudo-realistic image and said transmitted rendered second pseudo-realistic image comprises an omni-directional view.
19. The server of claim 11 , wherein said visual model data is provided in a selectable one of a plurality of formats.
20. A computer-readable medium containing instructions for controlling an electronic device to perform a method of visualization, the method comprising:
loading a 3 dimensional (3D) scene comprising visual model data;
rendering a pseudo-realistic image of said loaded 3D scene responsive to a first view positional indicator, said pseudo-realistic image comprising an omni-directional view; and
transmitting said rendered pseudo-realistic image to at least two remote computing platforms.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/120,721 US20110169826A1 (en) | 2008-09-28 | 2009-09-29 | Universal collaborative pseudo-realistic viewer |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10073408P | 2008-09-28 | 2008-09-28 | |
US13/120,721 US20110169826A1 (en) | 2008-09-28 | 2009-09-29 | Universal collaborative pseudo-realistic viewer |
PCT/IL2009/000927 WO2010035266A2 (en) | 2008-09-28 | 2009-09-29 | Universal collaborative pseudo-realistic viewer |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110169826A1 true US20110169826A1 (en) | 2011-07-14 |
Family
ID=42060195
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/558,509 Expired - Fee Related US8427473B2 (en) | 2008-09-28 | 2009-09-12 | Pseudo-realistic rendering of BIM data responsive to positional indicator |
US13/120,721 Abandoned US20110169826A1 (en) | 2008-09-28 | 2009-09-29 | Universal collaborative pseudo-realistic viewer |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/558,509 Expired - Fee Related US8427473B2 (en) | 2008-09-28 | 2009-09-12 | Pseudo-realistic rendering of BIM data responsive to positional indicator |
Country Status (2)
Country | Link |
---|---|
US (2) | US8427473B2 (en) |
WO (1) | WO2010035266A2 (en) |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8427473B2 (en) * | 2008-09-28 | 2013-04-23 | Rdv Systems Ltd. | Pseudo-realistic rendering of BIM data responsive to positional indicator |
US20120150573A1 (en) * | 2010-12-13 | 2012-06-14 | Omar Soubra | Real-time site monitoring design |
US9898862B2 (en) * | 2011-03-16 | 2018-02-20 | Oldcastle Buildingenvelope, Inc. | System and method for modeling buildings and building products |
US9342928B2 (en) | 2011-06-29 | 2016-05-17 | Honeywell International Inc. | Systems and methods for presenting building information |
US9019269B1 (en) * | 2011-11-28 | 2015-04-28 | Robert Alan Pogue | Interactive rendering of building information model data |
WO2014026021A1 (en) * | 2012-08-08 | 2014-02-13 | Virginia Tech Intellectual Properties, Inc. | Systems and methods for image-based searching |
US9639516B2 (en) * | 2012-11-08 | 2017-05-02 | Solibri, Inc. | System and method for express spreadsheet visualization for building information modeling |
US10262460B2 (en) * | 2012-11-30 | 2019-04-16 | Honeywell International Inc. | Three dimensional panorama image generation systems and methods |
US20140365558A1 (en) * | 2013-06-05 | 2014-12-11 | Wolfgis, Llc | System and method for visualizing complex gis location-based datasets |
US9204018B2 (en) | 2014-01-21 | 2015-12-01 | Carbon Objects, Inc. | System and method of adjusting the color of image objects based on chained reference points, gradient characterization, and pre-stored indicators of environmental lighting conditions |
WO2016033345A1 (en) * | 2014-08-29 | 2016-03-03 | Anguleris Technologies, Llc | Method and system for creating composite 3d models for building information modeling (bim) |
CN104794314A (en) * | 2015-05-18 | 2015-07-22 | 国家电网公司 | Three-dimensional simulation model of power transmission line and construction method of three-dimensional simulation model |
US10217242B1 (en) | 2015-05-28 | 2019-02-26 | Certainteed Corporation | System for visualization of a building material |
CN105334862A (en) * | 2015-10-28 | 2016-02-17 | 上海同筑信息科技有限公司 | BIM-based unmanned aerial vehicle monitoring method and system |
US10949805B2 (en) | 2015-11-06 | 2021-03-16 | Anguleris Technologies, Llc | Method and system for native object collaboration, revision and analytics for BIM and other design platforms |
US10867282B2 (en) | 2015-11-06 | 2020-12-15 | Anguleris Technologies, Llc | Method and system for GPS enabled model and site interaction and collaboration for BIM and other design platforms |
CN105652830B (en) * | 2015-12-25 | 2018-11-30 | 中国铁道科学研究院铁道建筑研究所 | A kind of bridge monitoring system based on BIM |
WO2017132636A1 (en) | 2016-01-29 | 2017-08-03 | Pointivo, Inc. | Systems and methods for extracting information about objects from scene information |
US9822509B1 (en) * | 2016-05-02 | 2017-11-21 | Caterpillar Inc. | Method of controlling machines at a worksite |
CN106373357A (en) * | 2016-08-30 | 2017-02-01 | 孟玲 | Bridge structure health monitoring system based on big data concept |
CN108429781A (en) * | 2017-08-12 | 2018-08-21 | 中民筑友科技投资有限公司 | A kind of file delivery method and file delivery system based on BIM |
CN109085966B (en) * | 2018-06-15 | 2020-09-08 | 广东康云多维视觉智能科技有限公司 | Three-dimensional display system and method based on cloud computing |
US11195324B1 (en) | 2018-08-14 | 2021-12-07 | Certainteed Llc | Systems and methods for visualization of building structures |
US10997553B2 (en) | 2018-10-29 | 2021-05-04 | DIGIBILT, Inc. | Method and system for automatically creating a bill of materials |
US11030709B2 (en) | 2018-10-29 | 2021-06-08 | DIGIBILT, Inc. | Method and system for automatically creating and assigning assembly labor activities (ALAs) to a bill of materials (BOM) |
DE102019105015A1 (en) * | 2019-02-27 | 2020-08-27 | Peri Gmbh | Construction of formwork and scaffolding using mobile devices |
US11475176B2 (en) | 2019-05-31 | 2022-10-18 | Anguleris Technologies, Llc | Method and system for automatically ordering and fulfilling architecture, design and construction product sample requests |
CN110610540B (en) * | 2019-09-05 | 2023-05-02 | 广州图测智能科技有限公司 | Quick rendering and plotting method of urban live-action three-dimensional model based on BIM and GIS |
CN114020977A (en) * | 2020-12-31 | 2022-02-08 | 万翼科技有限公司 | Building information model interaction method and related device |
US11778407B2 (en) * | 2021-08-10 | 2023-10-03 | Plantronics, Inc. | Camera-view acoustic fence |
US12079889B2 (en) | 2022-03-16 | 2024-09-03 | AkitaBox, Inc. | Method and system for capital management with custom assemblies and schedulable cost lines |
CN115761059B (en) * | 2022-12-14 | 2023-07-04 | 浙江中博信息工程有限公司 | BIM park-based space management system |
CN118014538B (en) * | 2024-04-09 | 2024-07-05 | 中国电建集团市政规划设计研究院有限公司 | Building engineering management method and system based on BIM and GIS |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080062167A1 (en) * | 2006-09-13 | 2008-03-13 | International Design And Construction Online, Inc. | Computer-based system and method for providing situational awareness for a structure using three-dimensional modeling |
-
2009
- 2009-09-12 US US12/558,509 patent/US8427473B2/en not_active Expired - Fee Related
- 2009-09-29 US US13/120,721 patent/US20110169826A1/en not_active Abandoned
- 2009-09-29 WO PCT/IL2009/000927 patent/WO2010035266A2/en active Application Filing
Patent Citations (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5936626A (en) * | 1996-02-07 | 1999-08-10 | Silicon Graphics, Inc. | Computer graphics silhouette load management |
US5832106A (en) * | 1996-05-22 | 1998-11-03 | Electronics And Telecommunications Research Institute | Method for camera calibration of range imaging system by use of neural network |
US6084979A (en) * | 1996-06-20 | 2000-07-04 | Carnegie Mellon University | Method for creating virtual reality |
US5905657A (en) * | 1996-12-19 | 1999-05-18 | Schlumberger Technology Corporation | Performing geoscience interpretation with simulated data |
US6128577A (en) * | 1996-12-19 | 2000-10-03 | Schlumberger Technology Corporation | Modeling geological structures and properties |
US6313837B1 (en) * | 1998-09-29 | 2001-11-06 | Schlumberger Technology Corporation | Modeling at more than one level of resolution |
US6692357B2 (en) * | 1998-11-19 | 2004-02-17 | Nintendo Co., Ltd. | Video game apparatus and method with enhanced player object action control |
US6437823B1 (en) * | 1999-04-30 | 2002-08-20 | Microsoft Corporation | Method and system for calibrating digital cameras |
US6816187B1 (en) * | 1999-06-08 | 2004-11-09 | Sony Corporation | Camera calibration apparatus and method, image processing apparatus and method, program providing medium, and camera |
US20020144278A1 (en) * | 1999-07-26 | 2002-10-03 | Pratts Edwin Daniel | System for transmitting desired digital media and audio signals in a 3-dimensional holographic format via computer network |
US6421116B1 (en) * | 1999-08-23 | 2002-07-16 | Bodenseewerk Geratetechnik Gmbh | Method for determining the relative movement between missile and target |
US6910001B2 (en) * | 2000-03-22 | 2005-06-21 | Schlumberger Technology Corp. | Distributed multiresolution geometry modeling system and method |
US6930715B1 (en) * | 2000-07-21 | 2005-08-16 | The Research Foundation Of The State University Of New York | Method, system and program product for augmenting an image of a scene with information about the scene |
US20020093538A1 (en) * | 2000-08-22 | 2002-07-18 | Bruce Carlin | Network-linked interactive three-dimensional composition and display of saleable objects in situ in viewer-selected scenes for purposes of object promotion and procurement, and generation of object advertisements |
US20030012409A1 (en) * | 2001-07-10 | 2003-01-16 | Overton Kenneth J. | Method and system for measurement of the duration an area is included in an image stream |
US6700573B2 (en) * | 2001-11-07 | 2004-03-02 | Novalogic, Inc. | Method for rendering realistic terrain simulation |
US20030103049A1 (en) * | 2001-11-30 | 2003-06-05 | Caterpillar Inc. | Cuts removal system for triangulated CAD models |
US20040105573A1 (en) * | 2002-10-15 | 2004-06-03 | Ulrich Neumann | Augmented virtual environments |
US20050091016A1 (en) * | 2003-09-29 | 2005-04-28 | Autodesk, Inc. | Surface smoothing techniques |
US20080027684A1 (en) * | 2005-06-23 | 2008-01-31 | Autodesk, Inc. | Graphical user interface for interactive construction of typical cross-section frameworks |
US20070016372A1 (en) * | 2005-07-14 | 2007-01-18 | Gm Global Technology Operations, Inc. | Remote Perspective Vehicle Environment Observation System |
US7692649B2 (en) * | 2005-10-04 | 2010-04-06 | Rdv Systems Ltd. | Method and apparatus for virtual reality presentation of civil engineering, land planning and infrastructure |
US7978192B2 (en) * | 2005-10-04 | 2011-07-12 | Rdv Systems Ltd. | Method and apparatus for evaluating sight distance |
US20080012863A1 (en) * | 2006-03-14 | 2008-01-17 | Kaon Interactive | Product visualization and interaction systems and methods thereof |
US20080229234A1 (en) * | 2007-03-16 | 2008-09-18 | The Mathworks, Inc. | Thin client graphical presentation and manipulation application |
US20080297505A1 (en) * | 2007-05-30 | 2008-12-04 | Rdv Systems, Ltd. | Method and apparatus for real-time 3d viewer with ray trace on demand |
US8031210B2 (en) * | 2007-09-30 | 2011-10-04 | Rdv Systems Ltd. | Method and apparatus for creating a composite image |
US20090128554A1 (en) * | 2007-11-19 | 2009-05-21 | Rdv Systems, Ltd. | Method and apparatus for determining view impact |
US20100110071A1 (en) * | 2008-09-28 | 2010-05-06 | Rdv Systems, Ltd. | Pseudo-realistic rendering of bim data responsive to positional indicator |
Cited By (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8762877B2 (en) | 2003-09-30 | 2014-06-24 | Ice Edge Business Solutions Ltd. | Creation and modification of valid functional design layouts |
US9536340B2 (en) | 2004-08-17 | 2017-01-03 | Dirtt Environmental Solutions, Ltd. | Software incorporating efficient 3-D rendering |
US8751950B2 (en) | 2004-08-17 | 2014-06-10 | Ice Edge Business Solutions Ltd. | Capturing a user's intent in design software |
US8762941B2 (en) | 2006-02-16 | 2014-06-24 | Dirtt Environmental Solutions, Ltd. | Rendering and modifying CAD design entities in object-oriented applications |
US9519407B2 (en) | 2008-03-11 | 2016-12-13 | Ice Edge Business Solutions, Ltd. | Automatically creating and modifying furniture layouts in design software |
US20120268463A1 (en) * | 2009-11-24 | 2012-10-25 | Ice Edge Business Solutions | Securely sharing design renderings over a network |
US9245064B2 (en) * | 2009-11-24 | 2016-01-26 | Ice Edge Business Solutions | Securely sharing design renderings over a network |
US20110276886A1 (en) * | 2010-05-05 | 2011-11-10 | Eric Hall | System and method for managing facility content and equipment information |
US9064219B2 (en) * | 2010-05-05 | 2015-06-23 | J. E. Dunn Construction Group, Inc. | System and method for managing facility content and equipment information |
US9595127B2 (en) | 2010-12-22 | 2017-03-14 | Zspace, Inc. | Three-dimensional collaboration |
US9189571B2 (en) | 2011-06-11 | 2015-11-17 | Ice Edge Business Solutions, Ltd. | Automated re-use of structural components |
US9879994B2 (en) * | 2011-06-15 | 2018-01-30 | Trimble Inc. | Method of placing a total station in a building |
US20120323534A1 (en) * | 2011-06-15 | 2012-12-20 | Kent Kahle | Method of placing a total station in a building |
US9323871B2 (en) | 2011-06-27 | 2016-04-26 | Trimble Navigation Limited | Collaborative development of a model on a network |
US9898852B2 (en) | 2011-11-15 | 2018-02-20 | Trimble Navigation Limited | Providing a real-time shared viewing experience in a three-dimensional modeling environment |
WO2013074568A1 (en) * | 2011-11-15 | 2013-05-23 | Trimble Navigation Limited | Browser-based collaborative development of a 3d model |
US9218692B2 (en) | 2011-11-15 | 2015-12-22 | Trimble Navigation Limited | Controlling rights to a drawing in a three-dimensional modeling environment |
US9460542B2 (en) | 2011-11-15 | 2016-10-04 | Trimble Navigation Limited | Browser-based collaborative development of a 3D model |
WO2013078041A1 (en) * | 2011-11-22 | 2013-05-30 | Trimble Navigation Limited | 3d modeling system distributed between a client device web browser and a server |
US10868890B2 (en) | 2011-11-22 | 2020-12-15 | Trimble Navigation Limited | 3D modeling system distributed between a client device web browser and a server |
US20140229865A1 (en) * | 2013-02-14 | 2014-08-14 | TeamUp Technologies, Inc. | Collaborative, multi-user system for viewing, rendering, and editing 3d assets |
US10345989B2 (en) * | 2013-02-14 | 2019-07-09 | Autodesk, Inc. | Collaborative, multi-user system for viewing, rendering, and editing 3D assets |
US11023094B2 (en) | 2013-02-14 | 2021-06-01 | Autodesk, Inc. | Collaborative, multi-user system for viewing, rendering, and editing 3D assets |
US9891791B1 (en) * | 2013-07-18 | 2018-02-13 | Autodesk, Inc. | Generating an interactive graph from a building information model |
WO2015095141A1 (en) * | 2013-12-16 | 2015-06-25 | Latista Technologies, Inc. | Project management system providing optimized interaction with digital models |
US10372839B2 (en) * | 2013-12-16 | 2019-08-06 | Latista Technologies, Inc. | Project management system providing optimized interaction with digital models |
RU2644506C2 (en) * | 2013-12-16 | 2018-02-12 | Латиста Текнолоджиз, Инк. | Project management system for providing optimal interaction with digital models |
US20150169791A1 (en) * | 2013-12-16 | 2015-06-18 | Latista Technologies, Inc. | Project management system providing optimized interaction with digital models |
US20150193711A1 (en) * | 2014-01-09 | 2015-07-09 | Latista Technologies, Inc. | Project Management System Providing Interactive Issue Creation and Management |
US10127507B2 (en) * | 2014-01-09 | 2018-11-13 | Latista Technologies, Inc. | Project management system providing interactive issue creation and management |
US10510044B2 (en) | 2014-01-09 | 2019-12-17 | Latista Technologies, Inc. | Project management system providing digital form-based inspections in the field |
US20150248503A1 (en) * | 2014-03-01 | 2015-09-03 | Benjamin F. GLUNZ | Method and system for creating 3d models from 2d data for building information modeling (bim) |
US9817922B2 (en) * | 2014-03-01 | 2017-11-14 | Anguleris Technologies, Llc | Method and system for creating 3D models from 2D data for building information modeling (BIM) |
US9782936B2 (en) | 2014-03-01 | 2017-10-10 | Anguleris Technologies, Llc | Method and system for creating composite 3D models for building information modeling (BIM) |
Also Published As
Publication number | Publication date |
---|---|
WO2010035266A2 (en) | 2010-04-01 |
US8427473B2 (en) | 2013-04-23 |
WO2010035266A3 (en) | 2010-05-14 |
US20100110071A1 (en) | 2010-05-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20110169826A1 (en) | Universal collaborative pseudo-realistic viewer | |
EP2973422B1 (en) | Overlaying two-dimensional map data on a three-dimensional scene | |
US10533871B2 (en) | Rendering a map using style identifiers | |
WO2019079311A1 (en) | Rolling shutter correction for images captured by a camera mounted on a moving vehicle | |
US20140267273A1 (en) | System and method for overlaying two-dimensional map elements over terrain geometry | |
Santana et al. | Multimodal location based services—semantic 3D city data as virtual and augmented reality | |
CN110379010A (en) | Three-dimensional geographic information method for visualizing and system based on video fusion | |
US9286712B2 (en) | System and method for approximating cartographic projections by linear transformation | |
JP6915001B2 (en) | Displaying objects based on multiple models | |
US20080294332A1 (en) | Method for Image Based Navigation Route Corridor For 3D View on Mobile Platforms for Mobile Users | |
US20150187128A1 (en) | Lighting of graphical objects based on environmental conditions | |
US20180322143A1 (en) | Interactive Device With Three-Dimensional Display | |
US20180278722A1 (en) | Mapless user interfaces for limited network conditions | |
CN113014824A (en) | Video picture processing method and device and electronic equipment | |
Noguera et al. | A scalable architecture for 3D map navigation on mobile devices | |
CN115760667A (en) | 3D WebGIS video fusion method under weak constraint condition | |
US8643678B1 (en) | Shadow generation | |
CN117671130A (en) | Digital twin intelligent fishing port construction and use method based on oblique photography | |
US8314791B2 (en) | Method and apparatus for determining view impact | |
Lu et al. | Webvrgis: Webgis based interactive online 3d virtual community | |
Dorffner et al. | Generation and visualization of 3D photo-models using hybrid block adjustment with assumptions on the object shape | |
RU2601165C2 (en) | Method for automated creation of three-dimensional systems of urban panoramas based on laser scanning data | |
Stødle et al. | High-performance visualisation of UAV sensor and image data with raster maps and topography in 3D | |
Robles-Ortega et al. | Efficient visibility determination in urban scenes considering terrain information | |
Tsujimoto et al. | Server-based mixed-reality system for multiple devices to visualize a large architectural model and simulations |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: RDV SYSTEMS LTD., ISRAEL Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ELSBERG, NATHAN;HAZANOV, ALEX;REEL/FRAME:027180/0734 Effective date: 20111101 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |