WO2021086319A1 - Generation of model data for three-dimensional printers - Google Patents
- Publication number
- WO2021086319A1 (PCT/US2019/058469)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- reference locations
- object model
- build
- source
- model
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B33—ADDITIVE MANUFACTURING TECHNOLOGY
- B33Y—ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
- B33Y50/00—Data acquisition or data processing for additive manufacturing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B29—WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
- B29C—SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
- B29C64/00—Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
- B29C64/30—Auxiliary operations or equipment
- B29C64/386—Data acquisition or data processing for additive manufacturing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2021—Shape modification
Definitions
- FIG. 1 shows an example process for generating an adjusted object model.
- FIG. 2 shows another example process for generating an adjusted object model.
- FIG. 3 is a flowchart showing an example method for generating object model data based on an adjusted object model.
- FIG. 4 shows an example controller to generate object model data.
- FIG. 5 shows an example of a computer readable medium comprising instructions to generate object model data.
- Many products or objects are manufactured to fit against, or otherwise interface with, another object during their use or operation.
- Examples include objects that are operated or used by a user and interface with a body part of the user, for example user input devices such as a computer mouse or games console controller, or items designed to fit to or against a body part, such as a prosthesis, eyeglasses or the insole of a shoe.
- Other examples include products designed to fit other types of naturally occurring, non-uniform object or body, such as a saddle for a horse, a birdhouse fitted to a tree branch, or a rock-climbing piton for fitting to a rock formation. There may exist discrepancies between the shape of a product and an object it is designed to interface with.
- any object to be built which is designed to fit to or against, or interface with, another object, which may be referred to as a source object or reference object, may be customised for improved interaction with the reference object.
- the present disclosure describes how a customised object may be determined, based on the particular topography of another object.
- a model of an object may be automatically adjusted by comparing reference locations in model data representing a user’s body part with corresponding reference locations in the object model. This adjustment may facilitate 3D printing of an object that is more accurately fitted to the user’s body part.
- body part model data may be used to adjust an object model and generate object model data representing an adjusted object model.
- the process of producing a customised 3D-printed object includes: (i) obtaining body part model data; (ii) adjusting an object model and generating object model data based on the adjusted object model; and (iii) 3D printing the object defined by the model, or generating the object using the generated object model data by any other form of additive manufacturing system.
- the body part model data may reflect the surface of a human body part, and may be received by a pre-print application. Based on the body part model data, an object model may be adjusted. This adjustment may be performed in accordance with the details provided below. In some examples, adjustment of the object model may be performed automatically, manually, or a combination of both, in the pre-print application.
- an automatically adjusted object model generated by the pre-print application may be accepted, rejected or modified by a user.
- This automatic adjustment of the object model may utilize machine learning techniques.
- An entire object may be printed, of which either the whole or a part has been adjusted from an initial object model, or a part of the object may be printed for attachment to a further, standard object part after 3D printing.
- a part of a computer mouse may be comprised in the object model data, which could then be attached to a standard mouse base after printing.
- an interface portion of a prosthesis such as a socket portion of a prosthetic limb, may be comprised in the object model data, which may then be customised to fit a user and attached to other standard components of the prosthesis, such as the pylon of a prosthetic limb.
- the entire object may be printed.
- a part of the object may be printed so that it may be attached to another, standard object part after printing.
- This enables objects to be manufactured comprising a combination of 3D-printed parts, which may be customised, and standardised off-the-shelf parts.
- the standardised off-the-shelf parts may, for example, be 3D-printed, be molded, or be formed in any suitable manner.
- the manufactured object may therefore comprise an article, or assembled object, formed by combining a customised printed object with an additional, standard part, such that the assembled article may fit the body part or other reference object.
- FIG. 1 shows an example process for generating object model data based on body part model data and one of a set of object models.
- a scan of a body part 102 is used to generate the body part model data. While a hand is shown as the body part in this example, any part of the body may be used.
- This body part model data may be used to provide a digital representation of the body part 104 within the pre-print application 103. In some examples, this body part model data may be used to provide a visual representation of the body part, which could be moved or modified, or it may simply be provided as a data set based on which an object model may be adjusted.
- obtaining the scan may comprise scanning the body part in a 3D scanner 101, or scanning at least a portion of the body part which is to interface with an object. While a 3D scanner is shown in FIG. 1, in some examples multiple 2D images of the body part may instead be taken from different positions, in order to capture information about the 3D shape of the body part. These images may be taken by any system capable of taking images or photographs, such as a camera phone. In this example, the use of multiple 2D images, taken sequentially at different times from different angles or positions around the body part, may introduce inconsistencies due to the movement of the body part between different images. These inconsistencies may result in an adjusted object model that is an inaccurate representation of the user’s body part.
- the body part may be placed in a support to prevent or reduce movement during the time taken to capture the sequence of images. This may enable the body part to remain stationary between images, which may in some cases increase the accuracy of the generated body part model data.
- errors in the generation of the body part model data may arise as a result of shadows falling on the body part and/or the reflection of light from the surface of the body part, each of which may introduce inaccuracies in the obtained representation of the user’s body part.
- diffused light or low levels of light may be used so that there are no strong shadows.
- the brightness of ambient light may be selected to be below a threshold level above which shadows may occur that disrupt the accuracy of the scanning process.
- the body part may be sprayed or otherwise covered with a low reflectiveness, matt, or speckled material, in order to reduce the effect of surface reflection.
- the body part may be marked with specific spots or other markers at predetermined locations, in order that these locations may be used to generate the body part model data.
- these specific spots may be applied to predetermined locations on, for example, the fingers and/or the palm of the hand, and the body part model data may be generated based on the identified positions of these points of the hand.
- body part model data may be generated from different representations generated in relation to the user’s body part.
- the representation may be a hand print of the hand, from which data related to the hand itself is generated.
- body part model data may be generated based on a mould of the body part.
- a user may grip a soft, plastically deformable object to make the mould of their hand.
- this soft object may be a modelling clay or similar material, due to its malleability.
- the representations of the body part may then be obtained by scanning or otherwise photographing the plastically deformed object, or in some examples the object may be used as a support for the body part while the body part itself is scanned or otherwise imaged.
- Body part model data may be generated by a pre-print application, within which adjustment of an object is performed, or obtained by the pre-print application as an input.
- the body part model data also need not reflect an entire body part, but may instead represent a portion of that body part.
- body part model data may represent just the fingers and certain areas of the palm of a hand, which in themselves reflect parts of the body.
- Body part model data may comprise body part reference locations 105a - 105d, identifying particular locations on the body part.
- these body part reference locations may be of different categories.
- these body part reference locations may comprise categories of:
  - Press locations, which may represent locations of the body part that are intended to be adjacent a user input location of the computer mouse that is activated by pressing, for example a computer mouse button; and/or
  - Rest locations, which may represent locations on the fingers and/or palm that are intended to rest on a corresponding surface of the mouse.
- the reference locations could instead be any locations identified on the body part or in the body part model data.
- these body part reference locations may be identified on the body part before generating the body part model data, for example by marking the body part at the desired locations.
- the mould may be augmented with features that indicate the body part reference locations. For example, these features may be pins or markers that are added to the mould in order to identify the respective reference locations.
- body part reference locations may be determined automatically, manually or a combination of both.
- body part reference locations may be determined automatically and then updated by a user.
- the body part model data may comprise just the reference locations, whereas in some examples the body part model data may represent both the reference locations and at least a part of the surface of the body part.
- reference locations may be points and/or areas.
- the reference locations may be different for different categories of reference location, e.g. reference points for press locations and reference areas for rest locations.
- reference locations may be assigned a level of significance. This level of significance may indicate which reference locations are more or less significant to the generation of the customised object.
- particular categories of reference location may be allocated a level of significance over other categories. This level of significance may be assigned automatically, manually or a combination of both. For example, particular categories may automatically be assigned as more significant than others, which may then be updated by a user to change the significance level of any of the reference locations.
- an object model 106 may be automatically selected from a predetermined set of object models.
- the body part model data may be linked to a predetermined set of object models for a given category of object, from which the object model may be selected.
- the category of object may be a computer mouse and so the predetermined set may comprise a number of computer mouse models.
- different categories of object, each having an associated set of predetermined object models, may be associated with a user’s hand or other body part, and a computer mouse may be merely one of these categories.
- Linking the body part model data to this predetermined set may comprise determining the category of object when generating the body part model data or selecting the category after receiving the body part model data.
- an object model 106 may be selected based on a comparison of the body part reference locations 105a - 105d to object reference locations 107a - 107d included in the object model.
- object reference locations 107a - 107d may correspond to the body part reference locations 105a - 105d when the respective object and body part represented by the models are placed adjacent to one another in the manner in which the object is intended to be interacted with by the user’s body part, e.g. when a user’s hand is placed against the surface of a computer mouse.
- an object model may be selected in which object reference locations 107a - 107d most closely match the positions of body part reference locations 105a - 105d, during such interaction.
- the body part reference locations and object reference locations may correspond to the same or different categories of reference location.
- certain categories of reference location may all be points or areas, respectively, while in other examples each individual reference location can be either a point or an area.
- Object reference locations may not initially be in the correct positions on a given object model when compared with the body part reference locations.
- buttons on the object model may not be in the correct positions compared to hand reference locations and so the object reference locations may accordingly be adjusted until they match a respective hand reference location.
- Each object model may, however, have constraints on the positions of the object reference locations. For example, if the object model is a mouse, the mouse model may be constrained as to how many buttons it may have, where these buttons may be located, the sizes of the buttons and the size of the object. In some examples, any constraint relating to the design of the object may be considered. In some examples, these constraints may arise as a result of particular electronics to be fitted to a printed object.
- the electronics may include a circuit board with associated wires that may be able to move within a certain range of positions, for example between the circuit board itself and a switch located at a mouse button. This may enable variation within a predetermined range of possible positions on the object that components of the object, e.g. mouse buttons, could have and the overall shape that the object could take, such that the object reference locations may be adjusted within these constraints.
- positional degrees of freedom 108a - 108d of the object reference locations may comprise a specified positional variance or tolerance of the reference location in specified direction(s). These variances or positional degrees of freedom may be the same for each object reference location, or the specified positional variations may be independent from one another for the different reference locations.
- Automatic selection of the object model may comprise selecting an object model 106 with object reference locations 107a - 107d that may be adjusted within positional degrees of freedom 108a - 108d. In some examples this may comprise selecting an object model that has object reference locations that may be adjusted until they at least closely match the body part reference locations.
- automatic selection of the object model may comprise focusing on which of the more significant reference locations are closely matched. For example, when multiple object models are determined as being suitable for the body part model data, an object model may be selected that has closely matching reference locations having a high significance level. In some examples, an object model that does not have all reference locations closely enough located may be selected because significant reference locations are closely enough located. In this way, an object model may be chosen that is more accurately customised to the user’s specifications.
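The selection described above can be sketched in a few lines. This is a minimal illustration only: the scoring function, the data layout and all names (`select_object_model`, the `"refs"` key, the significance weights) are assumptions for the sketch, not details taken from the disclosure. It picks, from a predetermined set, the model whose reference locations lie closest to the body part reference locations, weighting more significant locations more heavily.

```python
from math import dist

def select_object_model(body_refs, models, significance=None):
    """Pick the model whose reference locations best match the body part.

    body_refs    -- {name: (x, y, z)} body part reference locations
    models       -- list of dicts, each with a 'refs' mapping like body_refs
    significance -- optional {name: weight}; higher weight = more significant
    All structures here are illustrative, not prescribed by the patent.
    """
    significance = significance or {}

    def score(model):
        # Sum of significance-weighted distances between corresponding
        # reference locations; lower is a closer overall match.
        return sum(
            significance.get(name, 1.0) * dist(loc, model["refs"][name])
            for name, loc in body_refs.items()
            if name in model["refs"]
        )

    return min(models, key=score)
```

With this scoring, a model whose highly significant reference locations match closely can win even if its less significant locations match poorly, mirroring the significance-based selection described above.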
- the object reference locations may be automatically adjusted. This may result in an adjusted object model 109 having adjusted object reference locations 110a - 110d.
- This automatic adjustment may comprise automatic adjustment based on any combination of:
- object reference locations may be adjusted until they match body part reference locations.
- automatic adjustment of the object reference locations may comprise adjusting the size, shape and/or orientation angle of the object represented by the object model.
- adjustment of the object reference locations may comprise focusing on adjusting more significant object reference locations over less significant object reference locations.
- reference locations may comprise reference areas and/or points.
- object reference areas may be adjusted based on a comparison with body part reference points.
- automatic adjustment may be performed by adjusting the object reference areas within the positional degrees of freedom specified for the respective object reference area within the object model so that the body part reference point falls within the respective object reference area.
- object reference areas may be adjusted based on a comparison with body part reference areas.
- automatic adjustment may comprise adjusting the object reference areas within the positional degrees of freedom so that the body part reference area falls within a respective object reference area, or so that there is as great a degree of overlap as possible of the respective reference areas.
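One way to picture the adjustment within positional degrees of freedom is a per-axis clamp: each object reference location moves toward its corresponding body part reference location, but never further than its specified tolerance. The per-axis clamping scheme and the function name are illustrative assumptions; the disclosure does not prescribe a particular adjustment rule.

```python
def adjust_reference_location(obj_loc, target_loc, dof):
    """Move an object reference location toward a body part reference
    location, clamped per axis to the positional degrees of freedom.

    obj_loc, target_loc -- (x, y, z) coordinates
    dof                 -- (dx, dy, dz) maximum displacement per axis
    Illustrative sketch only, not the patent's specified method.
    """
    return tuple(
        o + max(-d, min(d, t - o))   # clamp each axis offset to +/- d
        for o, t, d in zip(obj_loc, target_loc, dof)
    )
```

If the target lies within the degrees of freedom the reference location lands exactly on it; otherwise it moves as close as the constraints (e.g. the electronics described above) allow.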
- the object model may be adjusted based on additional criteria specified in the body part model data. For example, where a hand needs to operate a computer mouse, the mouse may need to fit the wrist and arm of the user. It may be possible then to adjust the angular orientation or shape of the object model based on these additional criteria or constraints. These criteria may be added to the body part model data automatically, manually or a combination of both. The body part model data may therefore be updated to more accurately represent the user’s body part. For example, the user may change the angular orientation or shape based on their preference, or based on the particular use that will be made of the object. The user may also be able to correct for errors arising during the generation of the body part model data.
- Automatic adjustment of the object model based on the reference locations may comprise an automatic adjustment of the shape of the object model. This may comprise assigning each reference location a particular area, for example an area surrounding the reference location. Adjusting the object reference location may then result in automatic adjustment of the surrounding area based on the adjustment made to the reference location, and so will change the overall shape of the object. In some examples a reference location may be assigned a particular reference location shape, such that moving the reference location will move the reference location shape and change the overall object shape.
- In some examples, the object model may be adjusted using a number of adjustable object features. In the example where the object is a mouse, these adjustable mouse features may comprise:
- the mouse model may incorporate locations where buttons may be placed during generation of the mouse by 3D printing. These buttons may have associated mouse reference locations, which may be associated with corresponding hand reference locations in the hand model, and the hand reference locations may be used to adjust the mouse reference locations of the buttons in the mouse model.
- the adjustable mouse features may include the weight of the mouse when printed.
- the weight may be estimated based on a calculation of the amount of material to be used for printing the mouse.
- the weight of the mouse may be reduced in the design by reducing solid volumes in the mouse structure, but the design may allow for weights to be added at specified locations in the mouse, once manufactured, in accordance with a user’s preferences regarding weight distribution and total weight of the mouse.
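The weight estimate mentioned above can be approximated from the model's solid volume, the chosen infill density and the print material's density. This is a rough sketch under stated assumptions (real slicers also account for shell walls, supports and so on); the function name and parameters are illustrative, and the PLA density used in the example is a typical published value, not from the disclosure.

```python
def estimate_printed_weight(solid_volume_cm3, infill_density,
                            material_density_g_cm3):
    """Rough weight estimate for a printed part.

    solid_volume_cm3       -- volume enclosed by the model, in cm^3
    infill_density         -- 1.0 for solid fill, lower for honeycomb etc.
    material_density_g_cm3 -- density of the print material, in g/cm^3
    Returns an estimated mass in grams. Illustrative sketch only.
    """
    return solid_volume_cm3 * infill_density * material_density_g_cm3
```

For example, a 100 cm^3 part at 20% honeycomb infill in PLA (roughly 1.24 g/cm^3) would come out around 25 g, showing how switching from a solid to a honeycomb filling reduces the printed weight.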
- the adjustable mouse features may comprise the filling structure of internal volumes of the mouse model. For example, solid or honeycomb fillings, or other structures having a lower density than a solid volume, may be chosen.
- the honeycomb or other lower-density filling may provide air ventilation of the electronics in the mouse, reducing heat build-up and potentially reducing risk of faults after long term operation of the mouse. This may also reduce the overall weight of the mouse compared to using a solid filling.
- a honeycomb filling may be applied to areas surrounding the electronics, while a plain filling may be applied elsewhere.
- Areas of the surface of the mouse model may be allocated different textures. For example, some areas may be provided with a texture that increases grip. These textured areas may be assigned based on, for example, the rest reference locations. In some examples, these textured areas may be applied to press locations where the user might not want fingers to slip when pressing a mouse button. In some examples the mouse model may include areas that may be personalized with different markers and colours.
- adjustable object features may comprise any features of the object and the described adjustable object features may be applied to objects other than a mouse.
- Adjustable object features may be incorporated into the object model either automatically, manually or a combination of both. For example, a particular colour or texture may be applied to a particular area of the object model. A user may want to update this and so may modify the texture and/or colour that is applied to these areas. Automatic inclusion of adjustable object features may use a predetermined set of features, but it may be possible for a user to introduce new features not previously determined. For example, the user may introduce new colours or textures, or even update colours or textures in the predetermined set. Adjustable object features may be incorporated into the object model data when selecting the object model, or by receiving object feature data that may reflect adjustable object features.
- the methods of adjusting an object model detailed above may be performed in any combination, and may be performed automatically, manually or a combination of both. For example, all of these described methods may be automatically performed, and then updated by a user. In other examples, each of the described adjustments may be performed separately with user update after each respective stage.
- An adjusted object model 109 may be generated following these adjustments. This adjusted object model may comprise adjustments relating to any combination of the above described forms of adjustment.
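The shape adjustment described earlier, where moving a reference location also moves an assigned surrounding area, can be sketched as a falloff-weighted surface deformation: vertices near the reference location follow its displacement fully, and the influence fades with distance. The Gaussian falloff and all names here are illustrative assumptions; the disclosure does not specify a deformation scheme.

```python
from math import dist, exp

def deform_surface(vertices, ref_old, ref_new, radius):
    """Drag surface vertices along with an adjusted reference location.

    vertices          -- list of (x, y, z) points on the object surface
    ref_old / ref_new -- reference location before / after adjustment
    radius            -- distance over which the influence decays
    Illustrative free-form-deformation sketch, not the patent's method.
    """
    dx = [n - o for n, o in zip(ref_new, ref_old)]
    out = []
    for v in vertices:
        # Gaussian falloff: 1.0 at the reference location, ~0 far away.
        w = exp(-(dist(v, ref_old) / radius) ** 2)
        out.append(tuple(c + w * d for c, d in zip(v, dx)))
    return out
```

A vertex at the reference location moves with it exactly, while distant vertices are left essentially unchanged, so the overall object shape changes smoothly around each adjusted reference location.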
- This adjusted object model may be modified by a user. For example, the user may modify the automatically adjusted object model to further refine the shape. In some examples the user may introduce additional reference locations, enabling further adjustment of the object model.
- Object model data may be generated based on the adjusted object model. This object model data may be generated by the pre-print application, or may be generated separately, either immediately following generation of the adjusted object model or at a later time.
- object model data may not represent an entire object, but may instead represent a part of the object.
- FIG. 2 shows an example in which the object is a mouse, and the mouse model data comprises:
- the mouse top portion 208 is adjusted based on the hand model data, but may be attached to a standard mouse base portion 209 whose dimensions are not adjusted in dependence on the hand model. In this way the top portion alone may be printed, and then attached to the standard mouse base portion after printing.
- an adjusted mouse model is generated, wherein the mouse model represents a top portion of the mouse.
- the mouse model data is obtained by the pre-print application 203.
- the hand 202 may be scanned by a 3D scanner 201.
- the hand model data may be used to provide a visual representation 204 of the hand, whereas in other examples the hand model data will simply be a data set representing features of the shape of the hand.
- the hand model comprises hand reference points 205a - 205d, which may be automatically assigned or may be specified by a user. The reference points may be of different categories.
- reference point 205a, located on a finger tip of the hand, may be a “press” location or “click” point, which is a location on the hand at which a corresponding point on the mouse should be provided with a user input control such as a mouse button.
- Other reference points such as points 205b - 205d may be rest locations at which it is specified that corresponding points on the mouse should be provided with a surface on which that part of the user’s hand can rest.
- a predetermined set of mouse models each comprise alternative models of a mouse, or portion of a mouse, and each include mouse reference locations.
- mouse reference locations 211a - 211d may be confined to the top portion 208, but have positional degrees of freedom 212a - 212d resulting from constraints of the base portion 209. These constraints may arise from the same features as detailed above.
- the positional degrees of freedom may be constrained by electronics 210 that are fitted to a standard base portion 209.
- the constraints on the reference points of the top portion may include constraints determined by the dimensions of a standard base portion onto which the top portion is intended to fit.
- the mouse model may be selected based on a comparison of the hand reference locations 205a - 205d with the mouse reference locations 211a - 211d, within specified positional degrees of freedom 212a - 212d. This selection may be based on determining a mouse model from a predetermined set of mouse models in which the mouse reference locations most closely match the specified hand reference locations, or in which the hand reference locations are able to be accommodated adjacent to respective mouse reference locations within the variations permitted by the specified degrees of freedom.
- the mouse model may then be adjusted in accordance with any combination of the above described adjustments.
- the mouse reference locations may be adjusted.
- adjusted mouse reference locations 214a - 214d may be generated by adjusting the mouse reference locations 211a - 211d in relation to the corresponding hand reference locations 205a - 205d.
- An adjusted top portion 213 may be generated representing the adjusted mouse model.
- This adjusted mouse model may be used to generate mouse model data, which may be used for printing of the top portion of the mouse.
- the mouse model may represent any part of a mouse, and is not limited to just the whole mouse or a top portion.
- the mouse reference locations may then be located on any part of a mouse model. For example, mouse models that comprise only a small part of the mouse may be differently constrained than mouse models that comprise larger parts of the mouse. The mouse reference locations may then be constrained as a result of various parts of the mouse, and not just the base.
- the adjusted top portion 213 may be used to 3D print a customised top portion of a mouse, which is customised to fit the user’s hand as specified in the hand model data, and fits a selected standard mouse base portion and electronics.
- the customised mouse top portion may have mouse buttons located in positions specified by the user as reference points in the hand model data.
- FIG. 3 shows an example of a method for generating object model data, which may be used to generate printer data comprising build data to control a 3D printer to generate an object.
- the method 300 comprises initially obtaining body part model data 301.
- an object model is automatically selected from a predetermined set of object models.
- the object model may be selected based on the considerations described above, and in particular an object model may be selected in which the object reference locations most closely match the body part reference locations.
- the object model may represent either an entire object or a part of the object. In this way, at least a part of the object may be selected for customisation. This may comprise selecting just the part that is to be customised, or customising that part on an entire object.
- the object reference locations are automatically adjusted at 305, based on the body part reference locations. This automatic adjustment may be performed in accordance with the details described above. In some examples the object reference locations may be adjusted to match the body part reference locations. While the object reference locations may be adjusted to match the body part reference locations, this may not always be possible. For example, the number of object reference locations may not match the number of body part reference locations. In this case automatic adjustment may comprise adjusting the reference locations that match.
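When the number of object reference locations differs from the number of body part reference locations, one simple way to "adjust the reference locations that match" is to pair each body part location with its nearest unused object location and leave the rest alone. Greedy nearest-neighbour pairing is an illustrative choice of matching strategy, not a method specified in the disclosure.

```python
from math import dist

def match_reference_locations(body_refs, obj_refs):
    """Greedily pair each body part reference location with its nearest
    unused object reference location; surplus locations stay unpaired.

    body_refs, obj_refs -- lists of (x, y, z) points; counts may differ.
    Illustrative sketch only.
    """
    pairs, remaining = [], list(range(len(obj_refs)))
    for b in body_refs:
        if not remaining:
            break  # more body part locations than object locations
        j = min(remaining, key=lambda k: dist(b, obj_refs[k]))
        remaining.remove(j)
        pairs.append((b, obj_refs[j]))
    return pairs
```

The resulting pairs are the locations that "match" and can then be adjusted toward one another, while unmatched object reference locations are simply left in their default positions.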
- the object model is automatically adjusted based on criteria of the body part model data, as described above. For example, this adjustment may comprise adjusting the angular orientation or shape of the object model, where these are not determined by the object reference locations. These criteria may already form part of the body part model data, or they may be incorporated into the body part model data at 306.
- the object model includes locations for adjustable object features. If these are present, adjustable object features may be obtained at 308 and incorporated into the object model 309.
- the object model may comprise locations that have different textures, and a number of textures may be applied to the model.
- the adjusted object model is modified based on user input at 311. For example, the user may modify the particular shape of the adjusted object model and/or modify the adjustable object features.
- object model data is generated at 312, based on the adjusted object model. Where the object model represents only a part of the object, the generation of the object model data may comprise generating object model data for that part, or the entire object incorporating that customised part.
- the object model data generated at 312 may be used for generating printer data for a 3D printer.
- FIG. 3 illustrates an example method for the generation of object model data, which may then be used to generate print data.
- the user may be able to adjust the body part reference locations immediately after obtaining or determining them.
- the user may also be able to add or remove body part reference locations.
- the ability to update automatically generated reference locations may enable the user to correct for errors in this automatic process, or allow for further customisation beyond what would be possible using the initial reference locations.
- the user may also change the number of body part reference locations after the object model is selected, in order to change the object model that is automatically selected.
- the adjusted object model may be saved and added to the predetermined set of object models, so that it may be used in future build preparations. This may reduce the processing time for future object customisations, since future users may be able to select an object model that already closely matches their specifications.
- machine learning techniques may be implemented so that the system learns from previously determined object models and applies the properties of the object model to future builds.
- the system may apply the properties to a number of identical builds so that the object may be mass produced without the need for individual approval by a user.
- machine learning may be used to learn from the positioning of body part reference points in previous object build preparations.
- a user may be able to select how many, and what kinds of, reference locations they want to include in the body part model data, and these may be automatically positioned. These may then be updated by the user.
- Machine learning may also enable the system to learn from previous object reference location adjustments of particular models, for example when a user updates an automatically adjusted object model, so that future object reference locations may be adjusted more efficiently, with less user input.
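One very simple form such learning could take is averaging the corrections users applied in previous build preparations and biasing future automatic placements by that mean offset (a sketch under that assumption only; the description above does not prescribe a particular learning technique, and the function name and data are illustrative):

```python
def predict_location(auto_location, correction_history):
    """Bias an automatically placed reference location by the mean of the
    displacement corrections users applied in previous builds, so that
    future adjustments need less user input."""
    if not correction_history:
        return auto_location
    n = len(correction_history)
    offset = tuple(sum(c[i] for c in correction_history) / n for i in range(3))
    return tuple(a + o for a, o in zip(auto_location, offset))

# Hypothetical history: users previously nudged this location by +0.2 and +0.4 on x
history = [(0.2, 0.0, 0.0), (0.4, 0.0, 0.0)]
predicted = predict_location((1.0, 2.0, 3.0), history)  # approximately (1.3, 2.0, 3.0)
```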
- Automatic adjustment of an object model may save processing time and power. For example, rather than having to build an entire object model in the pre-print application, selecting a predetermined object model based on the body part and object reference locations means that an object model that closely matches the body part may be chosen quickly.
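Selecting a predetermined object model by comparing reference locations can be sketched as follows (the label-keyed format and the ring models are illustrative assumptions; scoring first by unmatched labels and then by total distance is one plausible comparison, not the only one):

```python
def select_object_model(models, body_refs):
    """Choose the predetermined object model whose reference locations best
    match the body part reference locations: prefer models whose set of
    labels matches, then the smallest total distance at shared labels."""
    def score(model):
        shared = set(model["refs"]) & set(body_refs)
        unmatched = len(set(model["refs"]) ^ set(body_refs))
        distance = sum(
            sum((a - b) ** 2 for a, b in zip(model["refs"][label], body_refs[label])) ** 0.5
            for label in shared
        )
        return (unmatched, distance)  # tuples compare element by element
    return min(models, key=score)

# Hypothetical predetermined set: two ring models of different sizes
models = [
    {"name": "ring_small", "refs": {"knuckle": (0.0, 0.0, 0.0), "wrist": (0.0, -7.0, 0.0)}},
    {"name": "ring_large", "refs": {"knuckle": (0.0, 0.0, 0.0), "wrist": (0.0, -9.0, 0.0)}},
]
body_refs = {"knuckle": (0.0, 0.0, 0.0), "wrist": (0.0, -8.8, 0.0)}
chosen = select_object_model(models, body_refs)  # the larger ring is the closer match
```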
- the use of specified object reference locations, and the adjustment of these locations based on the corresponding body part reference locations, may mean that the adjustment of these locations is prioritised, with the surrounding areas of the object model then fitted around the reference locations accordingly. This is in contrast to having to adjust each individual point of the object model to the body part model.
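One way to realise such prioritised fitting is to move only the reference locations explicitly and let every other vertex follow by an inverse-distance-weighted blend of the reference displacements (a sketch only; the weighting scheme and names are assumptions, not something specified above):

```python
def fit_surrounding_points(vertices, ref_displacements, power=2.0):
    """Propagate reference-location adjustments to the rest of the object
    model: each vertex moves by a blend of the reference displacements,
    weighted by inverse distance to each reference location."""
    fitted = []
    for v in vertices:
        weights, moved = [], [0.0, 0.0, 0.0]
        for ref, disp in ref_displacements:
            d = sum((a - b) ** 2 for a, b in zip(v, ref)) ** 0.5
            w = 1.0 / (d ** power + 1e-9)  # vertices near a reference follow it closely
            weights.append(w)
            for i in range(3):
                moved[i] += w * disp[i]
        total = sum(weights)
        fitted.append(tuple(v[i] + moved[i] / total for i in range(3)))
    return fitted

# Hypothetical adjustment: one reference location moved by (1, 0, 0), another held fixed
refs = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)), ((10.0, 0.0, 0.0), (0.0, 0.0, 0.0))]
fitted = fit_surrounding_points([(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)], refs)
# a vertex sitting on a reference location follows that reference almost exactly
```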
- FIG. 4 shows an example of a controller 400 to generate object model data.
- the controller 400 comprises a processor 401 and a memory 402. Stored within the memory 402 are instructions 403 for generating object model data according to any of the examples described above.
- the controller 400 may be part of a computer running the instructions 403.
- the controller 400 may be part of a 3D printer which may be used to run the instructions 403 after obtaining body part model data.
- FIG. 5 shows a memory 502, which is an example of a computer readable medium storing instructions 510, 511, 512, 513 that, when executed by a processor 500 communicably coupled to an additive manufacturing system, in this case a 3D printer 501, cause the processor 500 to generate object model data in accordance with any of the examples described above.
- the computer readable medium 503 may be any form of storage device capable of storing executable instructions, such as a non-transient computer readable medium, for example Random Access Memory (RAM), Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, or the like.
- references above to a body part relate to examples in which the generated object model data is adjusted on the basis of a body part, for example a body part of a user or intended user of the object.
- references in the above description to body part model data, body part reference locations and a body part model will be understood to relate instead to the model data, reference locations and model, respectively, of such an object.
Abstract
Body part model data is obtained, representing a human body part and comprising body part reference locations. An object model is selected based on a comparison of object reference locations and body part reference locations. The object model is adjusted by adjusting object reference locations based on the body part reference locations, and object model data is generated based on the adjusted object model, for generation by a three-dimensional printer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2019/058469 WO2021086319A1 (fr) | 2019-10-29 | 2019-10-29 | Generation of model data for three-dimensional printers |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021086319A1 (fr) | 2021-05-06 |
Family
ID=75716165
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2019/058469 WO2021086319A1 (fr) | Generation of model data for three-dimensional printers | 2019-10-29 | 2019-10-29 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2021086319A1 (fr) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1356709B1 (fr) * | 2000-10-06 | 2008-10-22 | Phonak Ag | Manufacturing methods and systems for rapid production of hearing aid shells |
CN107187059A (zh) * | 2017-06-13 | 2017-09-22 | 成都智创华信科技有限公司 | Method for 3D printing a mouse housing |
WO2017163000A1 (fr) * | 2016-03-23 | 2017-09-28 | Sony Interactive Entertainment Inc. | 3D printing system |
US20190146246A1 (en) * | 2013-08-22 | 2019-05-16 | Bespoke, Inc. | Method and system to create custom, user-specific eyewear |
- 2019-10-29: PCT/US2019/058469 filed as WO2021086319A1 (active Application Filing)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6126437B2 (ja) | Image processing apparatus and image processing method | |
Gibson et al. | Direct digital manufacturing | |
US20200050965A1 (en) | System and method for capture and adaptive data generation for training for machine vision | |
CN104937635B (zh) | Model-based multi-hypothesis target tracker | |
JP5829371B2 (ja) | Facial animation using motion capture data | |
JPH05507211A (ja) | System for measuring feet and determining footwear size | |
CN105269813B (zh) | Control method of three-dimensional printing device and three-dimensional printing system | |
US11147353B2 (en) | Apparatus and method for model reconstruction using photogrammetry | |
CN102375540B (zh) | Information processing device and information processing method | |
US20160072986A1 (en) | Body part imaging system | |
US10582992B2 (en) | Method for determining a mapping of the contacts and/or distances between the maxillary and mandibular arches of a patient | |
JP2017120672A (ja) | Image processing apparatus and image processing method | |
CN111258411A (zh) | User interaction method and device | |
CN107492142B (zh) | Illumination-guided example-based stylization for 3D rendering | |
Oliver et al. | Towards footwear manufacturing 4.0: shoe sole robotic grasping in assembling operations | |
JP7068927B2 (ja) | Sculpture lesson system and program for sculpture lessons | |
KR102322634B1 (ko) | Image registration method and apparatus using tooth objects | |
WO2021086319A1 (fr) | Generation of model data for three-dimensional printers | |
JP2018094396A (ja) | Device and method for reversibly modifying the optical appearance of a garment | |
KR102180943B1 (ко) | Apparatus, method, and program for setting a clear aligner model using a dynamic function | |
JP7154823B2 (ja) | Information processing device, robot control device, information processing method, and program | |
KR102273146B1 (ко) | Method for manufacturing a surgical prosthesis | |
KR20180026029A (ко) | Plastic surgery simulation method | |
US20130155060A1 (en) | Method for setting a system for projecting an image onto a relief projection surface | |
CN111696177A (зh) | Method, device, and medium for generating a three-dimensional human body model and simulated portrait animation | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19950571 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19950571 Country of ref document: EP Kind code of ref document: A1 |