DOI: 10.1145/3544548.3581549
Research Article · Open Access

An Augmented Knitting Machine for Operational Assistance and Guided Improvisation

Published: 19 April 2023

Abstract

Computational mediation can unlock access to existing creative fabrication tools. By outfitting an otherwise purely mechanical hand-operated knitting machine with lightweight sensing capabilities, we produced a system which provides immediate feedback about the state and affordances of the underlying knitting machine. We describe our technical implementation, show modular interface applications which center the particular patterning capabilities of this kind of machine knitting, and discuss user experiences with interactive hybrid computational/mechanical systems.
Figure 1: Our experiential fabrication system augments a “lower tech” manual knitting machine with computational interpretation and guidance.

1 Introduction

Computational fabrication has enabled the production of artifacts, devices, and experiences in materials and contexts that would be unthinkably complex for fully analog tools to replicate. However, even with a vast array of possible computational fabrication technologies, several challenges prevent these technologies from being used to their full potential – for example, for all the varied tasks supported by traditional fabrication.
One is that fabrication machines are often positioned as the endpoint of a Computer Aided Manufacturing (CAM) pipeline, with much of the creative process occurring in Computer Aided Design (CAD) software with no immediate reference to the material properties or even the scale of the output. This is a problem especially in personal, creative fabrication in which the creator may wish to explore more intuitively, with a more interactive understanding of what they are creating. Another is that specialized computational fabrication machines are often highly expensive and fragile, which greatly undercuts their availability for novices or for experimental tasks.
To the first challenge, we see particular potential in on-machine interfaces, which situate creative design and planning interfaces directly on the fabrication machine. These narrow the gap between creator and material process, with particular advantages in working with anisotropic materials [38], understanding the real-world sizes of the output [25], and supporting material experimentation [7]. To the second, we propose augmenting fabrication machines which are not already computerized. These span a wide range of media and capabilities, from woodshop and kitchen appliances to agricultural machines. While computational fabrication is often associated with full automation (that is, systems which both sense and actuate under their own power with no production-time assistance from a human user), we observe that in many cases, the major advantage of computational assistance is simply its ability to manage complexity and to bridge the user’s knowledge gaps about the fabrication process. In such cases, full automation may not be necessary, or even desired – hybrid and collaborative systems offer advantages including interactivity within the creation process [5] as well as skill acquisition for the user [60]. Without an expectation of full automation, this wide range of existing tools and appliances might therefore be usefully augmented in lightweight and accessible ways.
We found a particular opportunity in manual machine knitting. We are inspired by research that demonstrates the flexibility of machine knitting – at the high end of automation, an industrial computer-controlled knitting machine might produce an entire “3D knit” object such as a sweater with no operator intervention [47]. However, manually operated knitting machines, which are often purely mechanical, are much more commonly accessible. A tremendous range of knit patterns can be produced on such machines [14] and, at their least expensive, such machines are used by casual hobbyists; mid-range machines are frequently used by students in textile design and fashion schools as well as more serious hobbyists.
However, while these machines can certainly produce fabric faster than hand-knitting on pointed needles, they do not necessarily require any less expertise. Like many well-developed but manual fabrication processes and machines, knitting machines can have complicated, inter-related mechanical settings. In fact, the relationship between the user’s operational input and the final object is arguably even more obscured than in hand-knitting, because the most recently formed fabric is hidden from sight behind the needle beds, preventing users from getting timely feedback on their actions.
We take these challenges and opportunities of manual machine knitting to present a case study for how a “lower tech” manually operated machine can be augmented with new capabilities using lightweight sensing and simulation methods. By combining machine state tracking with domain-aware interaction modules, our system provides immediate feedback about the recent past, current, and potential future states of the machine. This 1) enables creative access to the otherwise opaque fabrication process of manual machine knitting, broadening access to machine knitting overall as a fabrication technique and 2) provides an example of on-machine interaction using the machine as an immediate and embodied input, with implications for experience of working with the machine, especially for novice users.

2 Background and Related Work

Our work contributes a thread in technical HCI on fabrication machines and particularly on direct human interaction with them. The former, broad category includes material-specific Computer Aided Design/Manufacturing systems [8, 28], tooling to improve the experience of using digital fabrication processes [2, 15, 53], frameworks for understanding existing artisanal practice with fabrication machines [62, 66], and fabrication hardware for novel or unusual materials [27, 49]. The latter, under the banner “Interactive Fabrication,” combines hardware and software approaches to support real-time, seamless interaction with computational fabrication processes such as 3D printing and laser cutting [43], for outcomes such as especially rapid prototyping [48], intuitive material manipulation contexts [44], and co-creation [32, 39]. Interactive fabrication research includes design and hardware tactics to improve user safety and reduce iteration time [43], as well as abstractions suited to digital/physical revision and iteration [33, 63] and software tooling to produce flexible, often parametric, control for real-time fabrication contexts [18, 57, 61].
As a work of Interactive Fabrication research, our system shares design goals and values with several of these systems, including an emphasis on immediate, hands-on interaction as a basis for exploration and to build intuition, as well as leveraging technical approaches to enable unique creativity contexts outside of a standard CAD/CAM pipeline. In these concerns, we are particularly aligned with a “digital craftsmachineship” lens on Interactive Fabrication [9], in which a human’s experience of crafting is an important locus of meaning in addition to the tangible output, and in which hands-on labor is seen as an opportunity, not a drawback [5].
Our work diverges from typical Interactive Fabrication work because the underlying fabrication machine was always intended to be operated by hand, and it is not electronic or even electrical. Indeed, the system itself produces no physical output but instead focuses on enabling the human user to do so. Because of this, some concerns from typical Interactive Fabrication research, like reduced iteration time and imposing additional safety features, are less relevant to our system.

2.1 On-Machine Interaction

Within Interactive Fabrication, methods for locating a user interface directly on the fabrication hardware, such as via projections or Augmented Reality headsets, have been proposed as a way of helping users understand a fabrication machine’s output in real-world dimensions [25, 48], situating irregular materials directly into the context of fabrication [7, 38], and exploring context-specific CAD paradigms that do not translate as well to a screen [50]. Research in this area has particularly targeted unique fabrication techniques like formless heat-molding [44] for which a traditional CAD pipeline would be inadequate, as well as techniques with deep histories of expert use, such as woodcarving [67] and machining with a lathe [60].
We are inspired by the range of this work as well as its recognition of the strengths of hands-on, experiential fabrication. We similarly focus on a somewhat unusual domain and hope to highlight the possibilities for other fabrication paradigms that may be overlooked in HCI. Our work specifically proposes drawing on an existing, mature fabrication technology, with implications for the vast number of other such technologies currently in use.

2.2 Augmentation for Interactivity

In this work, we are exploring adding computational capabilities to a not-otherwise-computerized machine. This allows us to draw on a wealth of existing fabrication technologies and communities of practice, but integrating with existing mechanisms – especially complex ones, like knitting machines – can require strategic attention. Augmented reality research has long shown how helpful information can be overlaid onto physical surfaces [64], boosting users’ ability to navigate complex, domain-specific, and critical tasks [54]. Like our work, the “Drill Sergeant” [52] and “Adroid” [59] systems augment a manual fabrication task with real-time guidance. Our system broadens the focus from helping a user complete a specific task to encouraging the user’s understanding and confident agency with the underlying machine. As such, we foreground interpretation of the recent past and current state of the machine alongside suggestions of potential near-futures as flexible modules for a variety of modes of use.

2.3 Machine Knitting

Machine knitting has received attention recently in HCI, with research predominantly focused on fully automatic computational knitting, which has been used as a tool to generate databases of knit material properties [26], and to create complex technical materials for wearable [34, 40] and/or robotic contexts [3, 4, 35]. Design tools targeting fully automatic computational knitting have been studied in HCI as well as within the computer graphics research community, and approaches span from low-level compilation [42] and generating patterns from 3D models [29, 45, 47] to knit design interfaces [31] drawing on sewing pattern notations to simplify user interaction.
While our system does support outputting a record of the user’s knitting via the Knitout file format that was initially developed as a target for general-purpose knitting compilation [41], the needs of our system diverge greatly from these tools in several major aspects: 1) we needed to encapsulate and present to the user a very different set of knittability constraints than fully automatic machines support; 2) the output of our system is primarily not the Knitout record, but rather guidance to the human user; 3) our system is intended to be used interactively as knitting progresses. A more closely related work is the eLoominate system, which is a purpose-built peg knitting loom – a type of jig often used for teaching hand-knitting – with LED indicators to guide users through simple two-color patterns designed in advance in an accompanying application [22].
Less-automatic machines have primarily been explored on the hardware side. All Yarns Are Beautiful (AYAB) [10] is an open-source project which documents an Arduino-based replacement controller for the 1980s Brother ElectroKnit series of computerized home knitting machines – these machines are manually operated (non-motorized), but the computer controls specific patterning (typically used for two-color “pixel art”-style patterning). Another open-source project, OpenKnit [20], seeks to make it possible for a hobbyist to build a knitting machine from off-the-shelf and 3D printed parts. Depending on the build, an OpenKnit machine can be fully manual or mostly automatic. (OpenKnit is now a hardware startup, Kniterate [21], which makes fully-automatic machines.) AYAB and OpenKnit are both long-running projects with active communities (as of this writing, the OpenKnit Instructable has over 111,000 views [19], and the AYAB discussion group on Ravelry has 362 members [17]); other, smaller projects include small-run specialty tools for automating color changes [56] and repeating patterns across the width of a knit [36]. These hobbyist-led innovations have supported interactive art [51] as well as experimental architecture research [11]. Together, these projects highlight both a community interest in manual machine knitting and opportunities for creative practice.

3 Manual Machine-knitting

The variety and functionality of knitting machines have been documented by textiles industry experts [55] as well as within fabrication research [30, 46]. We provide an overview of terminology to give context for how our system augments a typical machine. As with the industrial computer-controlled knitting machines highlighted as general-purpose fabrication devices [4, 42], manual knitting machines form fabric on hook-shaped needles. These needles are arranged in parallel in individual slots on beds. The simplest manual knitting machines might have just one of these beds, in which case the needles run parallel to the floor, with the hook end of each needle facing the user. A more-complex machine would have two beds, arranged in the same inverted “v” as a computer-controlled machine. (In a hobbyist machine, the second bed might be sold as an optional attachment, and referred to as a “ribber.”)
Figure 2: (a) The overall layout of a v-bed manual knitting machine, showing the carriage (Image modified from Wikimedia Commons, [16]). To form a new loop, a machine needle is moved forward to grab a bit of yarn, then moved back to drop the previous loop over the new one (image modified with permission from [6]). (b) A view of the needle bed and one carrier on our machine.
The carriage is the main point of contact for the user. The carriage has two main roles. First, it controls the yarn carriers which position yarn in front of each needle; these may be directly integrated into the carriage body, or, as in the machine used in this work, the carriage may be able to selectively engage separate but passive carriers for multi-color knitting. (Industrial computer-controlled knitting machines coordinate the carriage and carriers either with selective mechanical engagement or with electronic synchronization.)
Second, the carriage contains a set of cams which, when the carriage is slid across the needle bed, push the needles up and down along their slots to carry out the operations of knitting. These operations include the eponymous “knit” operation, in which the needle is pushed forward to catch the yarn from a carrier, then pushed back down to pull a loop of the yarn through any previous loops on that needle, dropping those previous loops in the process (Figure 2). The other most typical operation is a “tuck,” which also grabs a loop of yarn from the carrier, but does not pull it through existing loops, instead incrementing the number of loops on that needle. A group of controls on the carriage configures the pattern of stitches carried out across a row. All together, these operations determine the loop-to-loop connections of the knitting, with effects in the knit surface’s stretchiness, density, and surface patterning.
To operate a manual knitting machine, the user must push the carriage across the needle bed for each row, alternating leftward and rightward passes. This action can require up to 15 lbs of force, and knitters typically stand at the machine to operate it.
The knitter can also transfer stitches (move them from one needle to another), but this is not an automatic operation as it is in industrial knitting; it must be done by hand, in an operation that takes some skill to perform quickly or reliably. (Nonetheless, these hand-manipulated stitches can greatly increase the repertoire of a manual machine-knitter [24].)

4 Patterning Affordances of Manual Machine Knitting

Our Dubied NHF4, like other manual v-bed weft knitting machines, is functionally very similar to machines made by the same company in the early 1900s, and fairly similar to its predecessors from the mid-1800s. Like many other mature machine technologies, its operation interface is constructed for reliability and relative power of expression, not for legibility or ease of use. We summarize the operation and patterning affordances in this section to form some basis for understanding our interface augmentations, as well as to underscore the difficulty of learning to use a manual knitting machine unassisted.
The details in this section describe machines with two beds, directional knit/tuck cams (“cardigan cams”), and high/low needle selection. This set of features is very representative of pre-computational industrial machines, educational models throughout the twentieth century, and contemporary industrial-style manual models such as the Flying Tiger HK-type. Some similar machines have a different number of needle selection types or a subset of these capabilities, such as non-directional cams.
While operating the machine, a knitter can adjust the various cam settings of the carriage, as well as engage different yarn carriers. When the knitter moves the carriage across the needle bed for each row of knitting, the carriage cams guide the needles up and down in their individual slots for knitting and tucking. The cam settings for a given pass of the carriage select which needles will knit, which will tuck, and which will be missed (passed by without being actuated). On knitting machines intended to support tubular knitting, as ours is, these carriage cam settings can be allocated independently for each direction of pass, per bed. The switches are therefore repeated for each bed-direction combination: leftward on the front bed, rightward on the front bed, leftward on the back bed, and rightward on the back bed.
However, within a bed, this selection is based on needle types. On our Dubied NHF4, there are two types of needles: “high” needles and “low” needles, referring to the distance the needle’s selector tab protrudes from the surface of the bed. The choice of which type of needle should be in each slot on the bed is established before knitting begins, and retained throughout the knit job. (Individual needles may additionally be put “out of work” – that is, set to a position where the cams do not actuate them, regardless of the cam settings – during knitting. Because this can create problems, such as jamming when a yarn loop prevents the needle from being taken fully out of work, our system never suggests it. None of the knit fabrics shown in this work required changing needle allocation during knitting.)
Figure 3: (a) A bird’s-eye diagram of the carriage shows that there are four sets of switches which can be divided into front and back bed and rightward and leftward directions. Within each bed/direction set, there are two switches – “knit/tuck” and “selection,” each with three possible positions. The “selection” switch rotates on the face of the carriage, and “knit/tuck” is a rocker switch. (b) The front rightward switches, labeled.
Switch Setting            Resulting Operation on Needle Type
Selection    Knit/Tuck    High Needle    Low Needle
all          knit         knit           knit
all          k/t          knit           tuck
all          tuck         tuck           tuck
high         knit         knit           miss
high         k/t          knit           miss
high         tuck         tuck           miss
none         knit         miss           miss
none         k/t          miss           miss
none         tuck         miss           miss
Table 1: Operations from Switch Settings
For each bed-direction, the cam settings are presented as a set of two switches (Figure 3) which alter the selection cams’ proximity to the needle bed. If a cam is brought close to the bed, it will catch and actuate all of the needles (both high and low). It might otherwise be brought away from the bed to miss all of the needles, or it might be adjusted to a distance where it actuates just the “high” ones without the “low.” Therefore, the three positions of the selection switch are “select all needles,” “select high needles only,” and “select no needles”; the three positions of the knit/tuck switch are “knit all selected needles,” “knit the high and tuck the low” (abbreviated as “k/t,” and assuming the low needles are selected), and “tuck all selected needles.”
The full range of how these switch settings interact with the two needle selection sets is summarized in Table 1.
Because these settings are allocated per bed-direction, a basic two-row-long sequence of operations can be performed without changing settings. For example, the front bed carriage might have its leftward settings be “knit all” and its rightward settings be “knit none,” with the back bed set to “knit none” leftward and “knit all” rightward. The result is a tubular knit, in which the knitting proceeds in a spiral – leftwards on the front bed and rightwards on the back – without the knitter needing to change settings between rows.
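The mapping in Table 1 reduces to a small decision procedure. The following TypeScript sketch is our own illustration (the names and types are not taken from the system’s codebase) of how a single needle’s operation could be resolved from a bed-direction’s two switch settings, using the tubular example above as a usage note:

```typescript
// Resolve which operation a needle receives on one carriage pass (Table 1).
type Selection = "all" | "high" | "none";
type KnitTuck = "knit" | "k/t" | "tuck";
type NeedleType = "high" | "low";
type Operation = "knit" | "tuck" | "miss";

function resolveOperation(
  selection: Selection,
  knitTuck: KnitTuck,
  needle: NeedleType
): Operation {
  // "none" keeps the cams too far from the bed to catch any selector tab;
  // "high" reaches only the taller tabs of the high needles.
  const selected =
    selection === "all" || (selection === "high" && needle === "high");
  if (!selected) return "miss";
  if (knitTuck === "knit") return "knit";
  if (knitTuck === "tuck") return "tuck";
  return needle === "high" ? "knit" : "tuck"; // "k/t": knit high, tuck low
}

// The tubular example above: the front bed knits all needles leftward...
console.log(resolveOperation("all", "knit", "low")); // "knit"
// ...and misses them all rightward (the back bed settings are mirrored).
console.log(resolveOperation("none", "knit", "low")); // "miss"
```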
Lastly, the machine’s rack lever changes the alignment between the front and back beds. At the neutral position, the two beds are aligned with each back bed needle almost directly across from its corresponding front bed needle; the lever can adjust this alignment by up to three needle-widths rightward or leftward. In fully automatic machine knitting, the rack alignment is primarily used in conjunction with “transfer” operations to move stitches around [42]. On our machine, transferring is done manually and does not depend on changes in bed alignment; however, the rack lever enables a unique category of knitting patterns known as “racked rib.” In these, the rack position is changed between knitting passes of a fabric formed on both beds per row, as in a “rib” (alternating front and back knits) or “cardigan” (a lofty fabric in which each row knits on one bed and tucks on the other). The changes in rack position entangle the columns of stitches, producing fabrics with puckers, tight zig-zags, or meandering waves.
Together, this system is fairly powerful, enabling a variety of structures such as tubes, ribbing, and cardigan to be knit without frequent settings changes on the part of the knitter. However, it is also highly nonintuitive for a beginner. The potential for frustrating accidents, such as causing tension problems by tucking or missing the same needle for too many consecutive passes, is high, and recovering from such errors can involve painstakingly picking yarn out of needles and re-starting the entire knit piece. To make matters worse, the newly-formed stitches hang between the two opaque metal beds of a v-bed machine and are thus not even visible to the knitter until many rows later. In the case of “racked rib” patterning, the resulting fabric can be quite complex and difficult to visualize; additionally, this technique is rare in hand knitting (where there are no “beds”), so it is likely to be an unfamiliar category even to users with a hand-knitting background.
Figure 4: Our system combines hardware and computer vision as input to drive a machine simulation and other interaction modules.

5 Implementing an Augmented Knitting Machine

In order to smooth the complex process of manual machine-knitting, our machine augmentations track the machine’s state and provide interpretation, visualization, and suggestions for the knitter.
We chose an on-machine approach to complement the necessarily hands-on process of knitting with the machine, as well as to provide immediacy. To allow the knitter’s focus to remain on the machine, our system uses the machine itself as the input: the system tracks the physical cam and rack settings of the machine, and it uses changes in the carriage position to determine when rows have been knit. These changes are reflected in a visual display which is positioned immediately above the needle bed. The display shows the current state of the machine, including recently-knit rows which are not yet visible on the actual machine, as well as optional additional modules such as patterning guidance. We diagram the technical implementation of our system in Figure 4.

5.1 Sensing

We chose lightweight methods to capture the machine’s settings at a given time. In designing the sensing method, we prioritized flexible approaches which would be straightforward to deploy and could be modified to suit other similar machines. Additionally, because we are augmenting an existing vintage machine, which is itself a lovely and complete artifact, we made only reversible changes to the machine.
To capture the racking position, we mounted a simple 3-axis accelerometer (GY-61 ADXL335) to the racking lever at the side of the machine (Figure 6, left). To sense carriage position, we mounted Hall effect sensors at the left and right sides of the machine to be triggered by magnets attached to the carriage. These sensors are mounted on rails with binder clips and are positioned to be just outside the knitting area for a given task (e.g. for a narrower fabric, they can be brought closer to the center of the machine). We sense the left and right positions separately to support “leaving one position but not yet arriving at the other” as an input gesture. We use an Arduino to debounce these hardware sensor inputs and send change event notifications over USB serial.
Because the cam switches are mechanically complex and somewhat numerous, we decided against hardware sensing for their positions. Instead, we used computer vision: we mounted two webcams to the bow of the carriage (Figure 6, center), one pointed at the front carriage and one at the back (Figure 5). During system use, a Processing sketch captures data within calibrated crop areas of the webcams.
For image classification, we used TensorFlow.js [1] with a separate model for each of the four switch sets (front leftward and rightward, and back leftward and rightward). Each model has ten classes: one for each of nine setting combinations (as listed in Table 1), plus one for “hands visible in the image,” to minimize updating the switch position display while the knitter is in the middle of adjusting a switch. We captured approximately 320 images of each switch set position group (e.g. “tuck on all needles, for back bed rightward rows”) using a second Processing sketch to manage the webcams and organize the data for each class. During image capture, we stored webcam input slightly outside the calibrated crop areas so that we could later augment our image data with randomly-chosen sub-crops at the final image size. This capture process took approximately half an hour. We manually sorted images with hands visible into a separate “hand” class for each switch set, then augmented the approximately 250 images remaining in each other class with randomized crops, blurring, and contrast adjustments, to a total of approximately 1600 images per switch set. Using a basic Keras model on a personal computer with an RTX 3070 GPU, we trained a three-layer convolutional neural network with 1.6 million parameters. Training took thirty seconds per model and reported 99.32% accuracy when reserving 20% of input images as validation data. We did not fine-tune our approach beyond what was suggested in an online tutorial [58], suggesting that comparable results do not require particular machine learning expertise.
We coordinate these sensors with a server written in Node which accepts the carriage and rack change events from the Arduino as well as image data from the Processing webcam sketch, uses TensorFlow.js to classify the image data, and passes machine state events to the frontend user interface over a websocket.
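As a rough sketch of the hardware-event side of this server – the event protocol, serial port path, and specific library choices below are our assumptions, not details reported in the paper – the flow from Arduino to front end might look like:

```typescript
// A minimal sketch of the sensing server, assuming the Arduino emits one
// debounced line per event (e.g. "HALL_LEFT 1" or "RACK -2"). The protocol
// strings and configuration here are illustrative only.
import { SerialPort } from "serialport";
import { ReadlineParser } from "@serialport/parser-readline";
import { WebSocketServer } from "ws";

const wss = new WebSocketServer({ port: 8080 });
const broadcast = (msg: object) => {
  const data = JSON.stringify(msg);
  for (const client of wss.clients) {
    if (client.readyState === 1 /* OPEN */) client.send(data);
  }
};

const port = new SerialPort({ path: "/dev/ttyACM0", baudRate: 115200 });
const lines = port.pipe(new ReadlineParser({ delimiter: "\n" }));

// Track the two Hall sensors separately so that "left one side but not yet
// arrived at the other" can itself serve as an input gesture.
const carriage = { atLeft: false, atRight: false };

lines.on("data", (line: string) => {
  const [event, value] = line.trim().split(" ");
  switch (event) {
    case "HALL_LEFT":
      carriage.atLeft = value === "1";
      broadcast({ type: "carriage", ...carriage });
      break;
    case "HALL_RIGHT":
      carriage.atRight = value === "1";
      broadcast({ type: "carriage", ...carriage });
      break;
    case "RACK": // accelerometer reading, quantized to a rack position
      broadcast({ type: "rack", position: Number(value) });
      break;
  }
});
```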
Figure 5: The views from the two webcams. Each camera captures the two dial sets belonging to one side of the carriage.
Figure 6: Our system’s hardware modifications to the machine are all removable with no damage to the underlying machine. Left: rack lever position is sensed with a three-axis accelerometer. Center: a pair of webcams is mounted to the “bow” that connects the front bed and back bed carriages. Each camera is positioned to capture the switch positions for its carriage. Right: Hall effect sensors are mounted to the rails that are intended for use with a mechanical row-counter. Magnets attached near the handles of the carriage pass over the sensors at the end of each row.
Figure 7: Because we track the machine operations, we can “replay” them on any Knitout-compatible machine. In this case, we have knit a duplicate scarf on a Shima Seiki SWG091N2, which has a much smaller stitch size than our system’s machine.

5.2 Machine model

We use a computational model of knitting machine state to track knitting progress. This model includes carriage position, yarn carriers, bed rack position, and a graph representation of the knit fabric being formed; together, these factors capture the current and past state of knitting, and can be used to project possible futures. These are particularly useful in visualizing the recent past (that is, the knitting which is currently hidden between the needle beds) and generating warnings about undesired next moves.
In addition to tracking direct operator actions, our underlying machine model retains a low-level history in the Knitout knitting machine operation language spec [41], allowing knit structures to be “replayed” on any Knitout-compatible computer-controlled knitting machine (Figure 7). This could allow a knitter to design interactively, then use an automated knitting machine to create multiple duplicates, or to knit at a different stitch size. On top of the basic needle-by-needle abstraction of Knitout, we model the carriage cam settings and needle types (“high” or “low”). Lastly, we maintain both 1) committed machine states, representing operations the knitter has already taken, and 2) potential machine states, representing possible futures given changes in the machine settings.
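To illustrate this layering, here is a simplified TypeScript sketch under our own naming (the actual system additionally models carriers, cam settings, needle types, and a full loop graph):

```typescript
// A simplified machine-state model: each committed carriage pass is reduced
// to per-needle operations and appended to a Knitout history, while projected
// states are built from copies that are never committed.
type Op = "knit" | "tuck" | "miss";
type Bed = "f" | "b";

class MachineModel {
  knitout: string[] = [";!knitout-2"];    // low-level operation history
  loops: Map<string, number> = new Map(); // loop count per needle, e.g. "b3"
  rack = 0;

  setRack(position: number) {
    this.rack = position;
    this.knitout.push(`rack ${position}`);
  }

  // Commit one carriage pass: dir is "+" (rightward) or "-" (leftward).
  commitPass(bed: Bed, dir: "+" | "-", ops: Op[], carrier: number) {
    ops.forEach((op, n) => {
      const needle = `${bed}${n}`;
      if (op === "knit") {
        this.loops.set(needle, 1); // knitting drops the prior loops
        this.knitout.push(`knit ${dir} ${needle} ${carrier}`);
      } else if (op === "tuck") {
        this.loops.set(needle, (this.loops.get(needle) ?? 0) + 1);
        this.knitout.push(`tuck ${dir} ${needle} ${carrier}`);
      }
    });
  }

  // Project a hypothetical pass without committing it, for previews/warnings.
  project(bed: Bed, dir: "+" | "-", ops: Op[], carrier: number): MachineModel {
    const copy = new MachineModel();
    copy.knitout = [...this.knitout];
    copy.loops = new Map(this.loops);
    copy.rack = this.rack;
    copy.commitPass(bed, dir, ops, carrier);
    return copy;
  }
}
```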
Figure 8: A screenshot of our basic machine visualization. On the left, a diagrammatic rendering of the carriage shows the machine’s current switch and rack settings. On the right, a mass-spring simulation shows the fabric that is being formed. The rows of knitting that are tinted yellow are a projection based on the current machine settings.

5.3 Visualization

To communicate machine state to the knitter, our front end system comprises several visualization modules which are written as interoperable JavaScript classes. (These can also be used as input devices themselves – while not part of the main on-machine interface scope of this work, this capability does allow a user to practice knitting virtually.)
We render carriage, rack, and yarn settings diagrammatically, with textual labels for the switch settings (Figure 8). When the user changes a cam or rack setting on the physical machine, this view is automatically updated to indicate the current settings. We render the machine as a simplified needle bed, with the needles aligned according to the current rack position and a symbol on each needle showing which operation would be applied at that needle if a row were made with the current settings.
The in-progress knitting is visualized using a mass-spring simulation, with the back bed yarn connections shaded slightly darker than the front. We abstract the stitch connections in the fabric into simple nodes and edges, instead of showing a literal yarn path, for readability. (We chose the mass-spring simulation for its particular suitability in showing how columns of stitches deflect in the “racked” patterning we highlight in section 6.3.) In this view, the rows that have already been knit are displayed in a yarn color, and future row predictions/suggestions are tinted yellow.
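For readers unfamiliar with the technique, a minimal mass-spring step of the kind this view relies on looks roughly like the following Verlet-style sketch (a generic illustration, not the system’s implementation):

```typescript
// Loops are point masses; loop-to-loop connections are springs. Each frame
// integrates positions, then relaxes springs toward their rest lengths.
interface Node2D { x: number; y: number; px: number; py: number; pinned: boolean; }
interface Spring { a: number; b: number; rest: number; }

function step(nodes: Node2D[], springs: Spring[], dt: number) {
  // Verlet integration with damping and light gravity so the fabric hangs.
  for (const n of nodes) {
    if (n.pinned) continue;
    const vx = (n.x - n.px) * 0.98;
    const vy = (n.y - n.py) * 0.98;
    n.px = n.x; n.py = n.y;
    n.x += vx;
    n.y += vy + 9.8 * dt * dt;
  }
  // A few relaxation iterations per frame keep the springs near rest length,
  // which is what makes racked columns of stitches visibly deflect sideways.
  for (let iter = 0; iter < 4; iter++) {
    for (const s of springs) {
      const a = nodes[s.a], b = nodes[s.b];
      const dx = b.x - a.x, dy = b.y - a.y;
      const dist = Math.hypot(dx, dy) || 1e-6;
      const corr = (dist - s.rest) / dist / 2;
      if (!a.pinned) { a.x += dx * corr; a.y += dy * corr; }
      if (!b.pinned) { b.x -= dx * corr; b.y -= dy * corr; }
    }
  }
}
```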
We also created a sequential panel representation of our “pattern rows” notation. Each panel shows the cam and rack settings, carriage direction, and yarn carrier needed to reproduce a particular row. When displayed as part of a live instruction set, each panel highlights the changes the knitter would need to make to follow that instruction.
These machine state and instruction panel views form the visual basis of the patterning interface modules we describe in Section 6.

5.4 Error checking

Because we model the fabric being formed and its relation to the machine, we can add error checking for problematic operations. For example, in Figure 8, the interface shows an error that “Needles on back bed would have 3 loops!” Needles can only hold so many loops, so when additional loops are added by successive tucking operations without intervening knit operations, the knitter runs a risk of overloading the needle, leading to dropped or torn loops. Because we model each needle with a reference to the loops it holds, the underlying model can provide warnings for such conditions in either the committed or projected machine states – for example, detecting when many consecutive “tuck” operations are performed at the same needle, which can also interfere with proper fabric takedown.
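As a concrete example, the tuck-overload check could be as simple as the following sketch, which assumes the per-needle loop counts from the machine model sketch above (the threshold value is our guess, not the system’s):

```typescript
// Walk a (committed or projected) machine state and flag needles whose loop
// count exceeds a safe threshold, producing messages like the one in Figure 8.
const MAX_LOOPS = 2;

function checkOverloadedNeedles(loops: Map<string, number>): string[] {
  const warnings: string[] = [];
  for (const [needle, count] of loops) {
    if (count > MAX_LOOPS) {
      const bed = needle.startsWith("b") ? "back" : "front";
      warnings.push(`Needles on ${bed} bed would have ${count} loops!`);
    }
  }
  return warnings;
}
```

Running this on a projected state (from the model’s `project` method above) warns the knitter before they commit the risky pass; running it on the committed state flags a problem that has already occurred.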

6 Interface Modules

Together, the previous sections described the technical underpinnings necessary to sense, track, and display the state of the machine and any in-progress knitting. We used these capabilities as component parts of three modules designed for different modes of engagement with the system: learning as a novice, creating functional patterns, and designing improvisationally.

6.1 Basic Operational Assistance

First, we created a view that provides an interpretation for the knitter of the interconnected machine settings and their effects on the next rows to be knit. In this view, the diagram of the cam and rack settings is shown live alongside a simulation of the existing fabric and a preview of what the next two rows of knitting would look like with the current settings. The cam settings are labeled with the name of their position (“all”/“high”/“none” and “knit”/“k/t”/“tuck”), and the diagrammatic view of the needles displays the operations as they would occur in the next pass at the current settings. When the knitter changes a cam or rack setting, these views update accordingly. The fabric display shows the recent rows that are still hidden from physical view behind the machine beds. Lastly, this module displays error checking messages to warn the knitter about potentially risky operations they have performed or would perform. This module therefore collects and displays information about the recent past, present, and potential near future states of the machine and fabric, giving the knitter information but not imposing any particular guidance.

6.2 Production Assistance for Function Integration: Pockets

Our second module is intended to help a knitter produce a specific outcome. We focused on producing fabrics with two-layer “tubular knit” areas, which could be used as open pockets or as closed regions to contain other materials. This knitting style requires the user to plan the locations of “high” and “low” needles, and to change cam settings at the beginning and end of the pocket section. If the user wants to knit a pocket which is open on one side of the knit, they will additionally need to switch cam settings every other row, even within the pocket section.
Figure 9: To produce open pockets, the knitter must switch the carriage cam settings every other row, and the rack setting every row. Our system helps the knitter keep track of these.
Our “Pockets” module provides a simple sketch-like interface for planning pocket locations (Figure 10). During knitting, it shows the knitter’s progress through the plan and provides row-by-row cam setting guidance.
Figure 10: The pocket-knitting interface. The left side panel shows the current machine settings. The center panel is an editable area in which the knitter lays out pockets. The right panel contains a scrolling sequence of instructions, with the next instruction magnified. If the knitter needs to change a machine setting, the instruction panel will highlight the needed changes with orange arrows.
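A small but load-bearing piece of this guidance is computing which settings differ between the machine’s current state and the next instruction, so that only the needed changes are highlighted. A hedged sketch of that diff, with field names of our own invention:

```typescript
// Compare the current machine settings with those an instruction requires,
// returning only the fields the knitter must change (the orange arrows in
// Figure 10). These field names are illustrative, not the system's own.
interface Settings {
  frontLeft: string;  // e.g. "all/knit"
  frontRight: string;
  backLeft: string;
  backRight: string;
  rack: number;
  carrier: number;
}

function settingsToChange(
  current: Settings,
  needed: Settings
): (keyof Settings)[] {
  return (Object.keys(needed) as (keyof Settings)[]).filter(
    (key) => current[key] !== needed[key]
  );
}

// Example: for an open pocket row, typically only the cam settings for one
// bed-direction and the rack position change between consecutive panels.
// settingsToChange(current, next) -> ["frontRight", "rack"]
```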
Figure 11: The two-layer area of the knit can be fully closed, and items can be embedded inside by inserting them just before the top row of the pocket area. Unlike in fully automated knitting, embedded items can be relatively large and fragile. Here, an LED backlight panel is embedded in a hat.
In Figure 11, we showcase an advantage of manual knitting machines over industrial knitting: a greater range of possibilities for integrating additional materials into the knit. (This would be dangerous and difficult with a high-gauge, fast-moving, delicate industrial machine.) In particular, items slimmer than the gap between the beds (6mm at knit time, which can be temporarily increased to 12mm while knitting is paused) can be embedded in the fabric by designing a closed pocket and inserting the object just before the end of pocket knitting.

6.3 Creativity Assistance: Paths of Improvisation

Our third module targets open-ended exploration with a greater depth of complexity than the first module. Because of the necessary presence of the knitter, manually-operated machine knitting presents a great opportunity for real-time creativity. However, the effects of particular cam setting choices can take a few rows to become clear, and a beginner may not have much basis for understanding their range of options. With the additional complication that recently-knit rows aren’t even visible to the knitter yet, the knitter might not have enough information to make improvisational choices.
To show how the knitter’s understanding of complex possible outcomes could be supported, we produced an interface module which generates and simulates a set of “path options” for the knitter to consider pursuing. Each path generates its instructions using its own sequence generation algorithm, and it is displayed as a sliding sequence of instruction panels alongside a fabric simulation with the hypothetical stitches that would be generated by that path highlighted in yellow. As with the Pockets interface, instruction panels show a live view of which settings the knitter needs to change to pursue that instruction. As knitting progresses, the set of path options is updated accordingly: paths whose “next step” corresponds to the action just taken by the knitter are advanced to show the following step, while paths which did not include that action are recomputed starting from the new state.
Figure 12: Our “Path Options” module shows three possible future outcomes based on different algorithmic tactics. For each path, a sliding window of instructions is shown alongside a preview of what the fabric would look like if that path were followed.
The path options module is written to be flexible and extensible with respect to which generative algorithms are used. We implemented three:
(1)
A “racked rib” path generator, which proposes either “rib” or “cardigan” cam settings (based on similarity to the current settings) and then modulates the per-row suggested rack position according to a wave function, stepping up and down by one rack position per row to hit the full range of positions (a sketch of this generator follows the list).
(2)
A Markov chain path generator, which derives suggestions based on past rows the knitter has made (with some initial seeding of basic row types).
(3)
A “best match” path generator, which attempts to match the recent knitting sequence to one of a list of named fabric types. This list was derived from a swatchbook assembled by Stoll (a manufacturer of knitting machines), and it includes stitch patterns like full and half cardigan, full and half milano, and tubular knitting.
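To make the flavor of these generators concrete, here is a sketch of the first one as a triangle-wave rack sweep; the ±3 range follows the rack lever’s travel described in Section 4, while the other details are our own simplification:

```typescript
// A "racked rib" path generator sketch: hold the proposed cam settings fixed
// and step the suggested rack position up and down one step per row,
// reversing at the limits to sweep the machine's full range (-3..+3).
function* rackedRibPath(
  baseSettings: string,
  startRack: number
): Generator<{ settings: string; rack: number }> {
  let rack = startRack;
  let direction = 1;
  while (true) {
    yield { settings: baseSettings, rack };
    if (rack >= 3) direction = -1;
    if (rack <= -3) direction = 1;
    rack += direction;
  }
}

// Example: preview the first eight row suggestions of a cardigan-based path.
const path = rackedRibPath("cardigan", 0);
console.log([...Array(8)].map(() => path.next().value));
```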

7 Improvisation by Novice Users

To gain insight about how our system could support learning and ultimately a creative practice, we introduced seven new users to the system.

7.0.1 Research questions.

We aimed to study 1) basic usability: whether participants could understand the annotations and use them to reason about machine operation; 2) improvisational usability: whether the system sufficiently scaffolded real-time decision-making; and 3) overall participant attitudes toward hand fabrication, computational mediation, and improvisational practice, both in their own work and as they experienced these aspects of our system. The first two questions are assessments of our specific technical system, while the last question relates to the broader possibilities for augmented manual machines and exploratory use of interactive fabrication.

7.0.2 Participants.

To avoid biasing the results on basic usability, we recruited participants with no machine knitting background, and no or minimal hand-knitting experience. For safety reasons, and to mitigate novelty effects from interacting with computational creation overall, we required experience in other computational production systems: six had 3D printed and/or laser-cut, and the remaining one had used computational systems for creative image generation. In order to meet these qualifications, and in accordance with COVID-related limitations on visitors, we recruited participants within our department, or family members of department members, who were not textiles researchers. Our participants ranged in age from 20 to approximately 40.

7.0.3 Procedure.

In each session, after asking the user to practice moving the carriage, we introduced the basic “interpretation” view (Figure 8) and gave a verbal explanation of the carriage settings. The user was encouraged to interact with the settings and knit as many new rows as they liked until they were “ready to learn another capability,” at which point we introduced the racking lever. The user was given the option to view a swatch of several “named” patterns (rib, tube, cardigan, half-milano, and a mock interlock structure) along with a paper printout of instructions for how to knit each. Finally, we introduced the “suggested paths” view (Figure 12) and again encouraged each user to interact for as long as they liked with the system. In all, users spent approximately an hour each interacting with the system. After this, we conducted a semi-structured interview with each user, focusing on their experience of the system, how it compared to past fabrication experiences (both computational and manual), and their creative decision-making throughout their process. While our interview was semi-structured, we asked each participant at least the following questions:
Please tell me about past creative fabrication experiences you’ve had, especially any involving textiles or digitally-mediated fabrication.
How did this experience compare to those?
Please tell me about what you made.
Please tell me about creative decisions you made during the fabrication process. (Interviewer may recall a specific instance if the participant used "thinking aloud" during the workshop)
Were you able to explore the possibilities you wanted to explore?
Given more time, what additional things would you like to try?

7.0.4 Analysis.

We recorded the audio from our interviews, photographs of knit artifacts, and time-stamped system logs. The system log data includes all user actions perceived by the system, such as changing a cam or rack setting and moving the carriage. (Note that this data is messy because it is not debounced – for example, moments when the classification system mis-categorized a cam setting are recorded as “changes,” even though categorization is done many times per second and is generally accurate, so these appear only as brief flickers to a user.)
To assess basic usability, we viewed the artifacts and system logs. In the artifacts, we started by looking for egregious knitting errors of the type which our error-checking (Section 5.4) was designed to help avoid. We found that all participants did successfully avoid these errors. While we did not deliberately include a comparison case with error-checking turned off, several of our participants discovered another, comparably predictable common error which we had not built checks for. These participants each encountered the same problem multiple times, suggesting that it was a difficult problem to avoid without tool assistance.
We found that users made many more one-off setting changes within the first part of their knitting, including much more changing of cam/rack settings without moving the carriage, to see the effect of these without committing it to the knit. This implies a process of initially gaining literacy with the system.
To assess improvisational usability and participants’ attitudes about computationally-mediated hand fabrication, we analyzed the interviews. One author performed a reflexive thematic analysis [12] by segmenting the interview transcripts and producing first highlights, then initial codes in a spreadsheet, then performing an iterative bottom-up coding. Because our questions largely centered on the experience of fabrication, our analysis is constructionist. We organize our observations of participant experiences and attitudes into themes in the following subsections.
Figure 13: P7’s swatch, showing a progression from row-by-row experiments through named fabric types, including the racked cardigan which requires per-row rack changes.

7.1 Scaffolded Learning

Our participants were novices to both knitting in general and machine knitting in particular. Most participants described their learning as initially undirected, and they expressed that the system made it possible to manipulate the machine without needing to first form a complete understanding of its operation. Indeed, participants described being able to operate the machine before understanding much at all: “This isn’t something I’d typically do and it’s nice to have something like this where I can just kind of jump in and I am very confused about a lot of things but eventually I will pick it up. With the help of the computer [...] I get a more intuitive sense as to what is happening under the surface as opposed to needing to be explained every little part of what’s happening.” (P6)
Similarly, P3 mentioned an initial period of knitting to get accustomed to the machine, before branching out: “It took me a little bit to get comfortable just going back and forth, but once I started being able to see what was happening, it was like, ‘Oh, I can change stuff up.’”
Depending on their goals, a knitter might find these modules to have too low a learning ceiling. P2, who was mostly interested in gaining and refining a mental model of how the machine worked, expressed concern that they might not truly be learning and summarized their interaction with the Paths module as “Well, I’m kind of just following instructions.” (We discuss this possible negative outcome in Section 8.3.)
However, other participants balanced their priorities between gaining a deeper understanding and generating an interesting artifact (in P1’s words: “I don’t really like to feel like I’m making garbage”). P6 enjoyed the system because “it was nice to see that I could put something together relatively easily and have some sort of guidance [...] and actually make something that looks like it was designed with purpose and intention,” implicitly regardless of whether it was their own purpose and intention.

7.2 Interaction with Hybrid Processes

Participants touched on feelings of “stress” (P1), “confidence” (P1, P7), “trust” (P1, P2, P6), and “self-reliance” (P6) to describe how they viewed their relationship to the system over the course of their session. P1 described the interpretation assistance as a kind of “re-assurance” and an “encouragement.”
In relation to how they thought of fully-automatic systems, they remarked on the relative power and also responsibility of hybrid interactive systems. Despite the usual premise that fully-automated systems aim to be reliable and predictable, every participant with computational fabrication experience mentioned that, when a problem does occasionally arise, the user typically doesn’t know until after it occurs. P2 compared using a fully automatic machine to “the handoff that happens [when you] give a plan or geometry to a secondary fabricator and trust that happens.” P6 gave a longer explanation that was also suggested by P1, P2, P3, and P7: “Since I’m physically at the system the whole time, working with it in this hybrid approach, it’s much easier to avoid any issues that might come up. With a 3D printer, with a lot of automated fabrication, there’s a kind of expectation that, well it’s automated for a reason; I don’t need to necessarily watch too much, within reason. [...] That is not always the case. Even printers that are industry standard sometimes can just have wild things happen to them. Things can go wrong and that is definitely something that is not likely going to happen with this hybrid approach. One, because it’s telling me where things might go wrong, and two because I am constantly there at the machine [...] For example if I’m pulling the machine across and I feel all the resistance building up that’s a pretty good indication that something is going wrong and I should be careful.”
The benefits and drawbacks of interactivity were summarized by P4: “If I just give something to a printer, the output is predictable all the time. But the thing is, if I play with something like this, I have the control. [...] So I have the rights to make a mistake as well [...] If I play something with my hands, putting more effort on it, I feel like I did something really by myself.”

7.3 Embodied Knowledge in Manual Machine Processes

In addition to the complex interpretive expertise of understanding the machine’s settings and operations, a manual machine knitter must learn the haptic and auditory cues of successful operation. Every participant remarked on gaining this knowledge over the course of the session. For example, from P6: “Knowing how hard to push – I would say it definitely faded back into my subconscious by the end.” And from P2: “even if you’re following [the guided improvisation module], at the start there is a lot of experiencing the difficulty in the haptics and understanding what feels right, and not, and the sort of rhythm you get into with switching the gear. Even if you’re not thinking about all those switches, you’re building that physical memory of the interaction with the machine, how everything should feel and sound.”
This embodied experience of using the manual machine was generally seen as a positive. In comparison to the fully automatic process of using a 3D printer, P6 said “Assuming in a perfect world that your [3D printer] is going to work well, you can just walk away from it and come back later once it’s done. But [having to physically operate the machine] isn’t always necessarily a bad thing in my mind. [...] I think there is an aspect to it, sometimes you just really want to zone in on one thing and make sure you’re doing that one thing really well.”
All participants at some point in their conversation made a full-body “moving the carriage with both arms” motion, and P5 did so with an onomatopoetic “shunk” sound as well. P3 made the gesture while saying “I was having fun with the process once I got more of a handle on it,” and later summarized the experience with “there’s a lot of satisfaction to it.”
The hands-on aspect of the process also prompted feelings of pride, or ownership. P6 was very enthusiastic about the aspect of handcraft in our system: “I think that there is something really really special about being able to make something... I say ‘by hand’ – I’m putting some giant air quotes around that because it’s using the machine – but, you know, something that you crafted yourself.”

8 Discussion

8.1 On-Machine Interaction for Experiential Fabrication

Drawing on related work in Interactive Fabrication, we proposed that on-machine interaction is especially suitable for contexts in which a hands-on experience is desirable, such as for personal or context-specific fabrication. With a manual knitting machine, the hands-on labor is not optional; however, this does not necessarily make it less desirable. Instead of being an unfortunate drawback of a less-expensive system, the need to physically operate the machine can be a creative opportunity [5]. Participants connected hands-on production to ideas of labor as a locus of value, for example suggesting that they might use the system to make nice gifts for loved ones which would be more valuable than something store-bought or fabricated automatically (P2, P6). In addition to the perceived social value of the artifact itself, hands-on systems can mediate valuable experiences – for example, an experience of learning [60] or of agentive collaboration with a machine [9].
In building our system, we made several design decisions to emphasize the experience of knitting. We positioned the machine itself as the only input – technically, the front-end interfaces could also be used with mouse clicks, for debugging or for explaining machine operation to someone without their own machine, but we typically deployed the system without either keyboard or mouse visible. We arranged the computer screen physically very close to the bed, to allow quick glances between the two and to reinforce the interaction metaphor of the machine itself as interface. (Future iterations of this work could conceivably close the distance even further with either projected imagery or with an Augmented Reality headset.) This closeness underpins a sense of immediacy, a private collaboration, between user and fabrication machine.
Additionally, participants, as well as the authors of this paper and anecdotally numerous lab visitors, have found the physical sensation of manual machine-knitting delightful. The auditory and haptic cues, along with the smoothly repetitive motion and feelings of control over a complex mechanism, add up to a uniquely satisfying experience.

8.2 Augmentation as a Way to Leverage Existing Machines

We also proposed that augmenting manual machines could increase the availability of machine processes to creators beyond those with the ability to purchase and maintain expensive automated machines. Researchers can lower financial barriers to computational fabrication through engineering new, lower-cost mechanical systems [7] or by using software approaches to squeeze latent functionality out of existing low-cost systems [23, 37]. Our participants mentioned that the system allowed them to find value in a machine that they may not have otherwise interacted with, either because it was intimidating or because, as practitioners of computational fabrication, they found the idea of purely mechanical machines boring. While we do not share this latter opinion and are not of the belief that our system inherently “elevates” the knitting machine, we do see this as evidence that the system widens access, bringing new attention to a mature and fascinating fabrication machine.
Augmentation does not need to destroy or subjugate the underlying machine. For both pragmatic and conceptual reasons, we found it important to choose entirely reversible hardware interventions, and we designed our modular software systems to offer flexible amounts of support.

8.3 Overreliance on Computational Guidance

A drawback of computational tools is that they can “water down” or de-skill production processes: if a user is simply enacting system instructions, they lose creative agency. This concern has become particularly topical as increasing use of machine learning techniques in creativity support has spurred a new wave of discourse on the relative roles of creators and computational systems.
Because we view machine augmentation as a possible way to scaffold learning, the idea that a creator could over-rely on a computational system to the detriment of developing their own intuition is concerning. Indeed, one participant mentioned exactly this concern (see Section 7.1). While each participant’s engagement with our system was too brief to produce deep expertise, we did observe that participants did not rely uniformly on each computational aspect of the system. They did lean heavily on basic usability assistance like error checking, which they explained by a fear of breaking the machine (P2, P4, P6, P7). However, they followed higher-level suggestions (in the “Paths” interface) much less strictly. This implies that they were able to view these appropriately as suggestions, which they had more agency to reject.

9 Future Work

We discussed two research areas to which this work contributes: on-machine interfaces and augmenting existing machines. The challenge of pursuing both simultaneously is that the system must be adaptable to a specific, possibly vintage or otherwise non-normative machine. In the case of the system documented in this work, as stated in Section 4, the underlying Dubied knitting machine is very typical of industrial-style v-bed knitting machines; while some machines have a different number of needle types and/or a subset of these cam settings, our machine model (Section 5.2) can easily be configured for these differences. Consumer single-bed machines typically have a different style of needle selection, but our model could be extended to cover this as well. A trickier proposition is adapting the hardware, such as the camera mount we constructed to attach to the mounting hole intended for an automatic yarn-changing mechanism. While this mounting point is likely standard for Dubied machines of a similar era onward, it is much less likely to be immediately portable to another brand. Similar situations exist in many other manual fabrication machines, such as machine shop tools, kitchen appliances, and sewing machines: while the basic mechanisms of a given type of machine are well-established, the specific form of the tool may vary widely. To address this problem, future work in this area could draw on research in “upcycling” [65] and adaptability [13] to generalize how disparate machines can best be outfitted with various categories of sensing.
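One lightweight way to express such configurability is a declarative machine profile consumed by the machine model. The sketch below is illustrative only; the field names and the specific cam settings are assumptions, not a specification of our system:

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class MachineProfile:
        """Declarative description of one machine's capabilities, so the
        same machine model can be re-targeted to different hardware."""
        beds: Tuple[str, ...] = ("front", "back")
        needle_types: Tuple[str, ...] = ("standard",)
        cam_settings: Tuple[str, ...] = ("knit", "tuck", "slip")

    # An industrial-style v-bed machine (settings listed are illustrative):
    VBED = MachineProfile()

    # A consumer single-bed machine restricts the beds and may expose a
    # different subset of settings:
    SINGLE_BED = MachineProfile(beds=("front",),
                                cam_settings=("knit", "slip"))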
More broadly, discovering other opportunities for computational mediation requires investigation into fabrication domains outside of those already embraced by HCI.

10 Conclusion

We described our lightweight approach to equipping an existing mechanical fabrication tool with sensing and visualization, and we showed production applications which center the particular patterning capabilities of this kind of machine knitting, including material inclusions such as e-textile systems, and row-level patterning for textured fabrics.
In all, we presented a computationally augmented fabrication system which provides immediate feedback about the state and affordances of the underlying knitting machine. We see this concrete technical system, and our discussion of our domain-aware implementation decisions, as a critical step toward broadly accessible real-time fabrication for creativity and education. We additionally hope this work can inspire the digital fabrication community to revisit the vast breadth of not-currently-computational fabrication equipment, supporting fabrication – whether automated, manual, or a novel hybrid – in a wide variety of domains.

Acknowledgments

The authors would like to thank David Renshaw for help with debugging, image classification model training, and photography. This research was supported by National Science Foundation grants CAREER IIS-2047912, IIS-2017008, and IIS-2118924.

Supplementary Material

MP4 File (3544548.3581549-video-preview.mp4): Video Preview
MP4 File (3544548.3581549-video-figure.mp4): Video Figure
MP4 File (3544548.3581549-talk-video.mp4): Pre-recorded Video Presentation

References

[1]
Martín Abadi, Ashish Agarwal, Paul Barham, Eugene Brevdo, Zhifeng Chen, Craig Citro, Greg S. Corrado, Andy Davis, Jeffrey Dean, Matthieu Devin, Sanjay Ghemawat, Ian Goodfellow, Andrew Harp, Geoffrey Irving, Michael Isard, Yangqing Jia, Rafal Jozefowicz, Lukasz Kaiser, Manjunath Kudlur, Josh Levenberg, Dandelion Mané, Rajat Monga, Sherry Moore, Derek Murray, Chris Olah, Mike Schuster, Jonathon Shlens, Benoit Steiner, Ilya Sutskever, Kunal Talwar, Paul Tucker, Vincent Vanhoucke, Vijay Vasudevan, Fernanda Viégas, Oriol Vinyals, Pete Warden, Martin Wattenberg, Martin Wicke, Yuan Yu, and Xiaoqiang Zheng. 2015. TensorFlow: Large-Scale Machine Learning on Heterogeneous Systems.
[2]
Muhammad Abdullah, Romeo Sommerfeld, Laurenz Seidel, Jonas Noack, Ran Zhang, Thijs Roumen, and Patrick Baudisch. 2021. Roadkill: Nesting Laser-Cut Objects for Fast Assembly. In The 34th Annual ACM Symposium on User Interface Software and Technology (UIST ’21). Association for Computing Machinery, New York, NY, USA, 972–984. https://doi.org/10.1145/3472749.3474799
[3]
Roland Aigner, Mira Alida Haberfellner, and Michael Haller. 2022. spaceR: Knitting Ready-Made, Tactile, and Highly Responsive Spacer-Fabric Force Sensors for Continuous Input. In Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology (UIST ’22). Association for Computing Machinery, New York, NY, USA, 1–15. https://doi.org/10.1145/3526113.3545694
[4]
Lea Albaugh, Scott Hudson, and Lining Yao. 2019. Digital Fabrication of Soft Actuated Objects by Machine Knitting. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). Association for Computing Machinery, Glasgow, Scotland, UK, 1–13. https://doi.org/10.1145/3290605.3300414
[5]
Lea Albaugh, Scott E. Hudson, Lining Yao, and Laura Devendorf. 2020. Investigating Underdetermination Through Interactive Computational Handweaving. In Proceedings of the 2020 ACM Designing Interactive Systems Conference (DIS ’20). Association for Computing Machinery, New York, NY, USA, 1033–1046. https://doi.org/10.1145/3357236.3395538
[6]
Lea Albaugh, James McCann, Scott E. Hudson, and Lining Yao. 2021. Engineering Multifunctional Spacer Fabrics Through Machine Knitting. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ’21). Association for Computing Machinery, New York, NY, USA, 1–12. https://doi.org/10.1145/3411764.3445564
[7]
Lea Albaugh, James McCann, Lining Yao, and Scott E. Hudson. 2021. Enabling Personal Computational Handweaving with a Low-Cost Jacquard Loom. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ’21). Association for Computing Machinery, New York, NY, USA, 1–10. https://doi.org/10.1145/3411764.3445750
[8]
Byoungkwon An, Ye Tao, Jianzhe Gu, Tingyu Cheng, Xiang ’Anthony’ Chen, Xiaoxiao Zhang, Wei Zhao, Youngwook Do, Shigeo Takahashi, Hsiang-Yun Wu, Teng Zhang, and Lining Yao. 2018. Thermorph: Democratizing 4D Printing of Self-Folding Materials and Interfaces. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18). Association for Computing Machinery, New York, NY, USA, 1–12. https://doi.org/10.1145/3173574.3173834
[9]
Kristina Andersen, Ron Wakkary, Laura Devendorf, and Alex McLean. 2019. Digital Crafts-Machine-Ship: Creative Collaborations with Machines. Interactions 27, 1 (Dec. 2019), 30–35. https://doi.org/10.1145/3373644
[10]
Andreas Müller. 2015. AYAB - All Yarns Are Beautiful. https://ayab-knitting.com/. (accessed 2022-09-07).
[11]
Yuliya Baranovskaya, Marshall Prado, Moritz Dörstelmann, and Achim Menges. 2016. Knitflatable Architecture - Pneumatically Activated Preprogrammed Knitted Textiles. In eCAADe 2016: Complexity & Simplicity. eCAADe 2016, Oulu, Finland, 571–580. https://doi.org/10.52842/conf.ecaade.2016.1.571
[12]
Virginia Braun and Victoria Clarke. 2019. Reflecting on Reflexive Thematic Analysis. Qualitative Research in Sport, Exercise and Health 11, 4 (Aug. 2019), 589–597. https://doi.org/10.1080/2159676X.2019.1628806
[13]
Xiang ’Anthony’ Chen, Jeeeun Kim, Jennifer Mankoff, Tovi Grossman, Stelian Coros, and Scott E. Hudson. 2016. Reprise: A Design Tool for Specifying, Generating, and Customizing 3D Printable Adaptations on Everyday Objects. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology. ACM, Tokyo Japan, 29–39. https://doi.org/10.1145/2984511.2984512
[14]
Alessandrina Costa. 2008. Alessandrina.Com. https://alessandrina.com/category/machine-knitting/. (accessed 2022-02-02).
[15]
Mustafa Doga Dogan, Steven Vidal Acevedo Colon, Varnika Sinha, Kaan Akşit, and Stefanie Mueller. 2021. SensiCut: Material-Aware Laser Cutting Using Speckle Sensing and Deep Learning. In The 34th Annual ACM Symposium on User Interface Software and Technology (UIST ’21). Association for Computing Machinery, New York, NY, USA, 24–38. https://doi.org/10.1145/3472749.3474733
[16]
Elkágyé. 2018. Manually Operated Flat Knitting Machine. (21 January 2018).
[17]
Jessica Forbes and Cassidy Forbes. 2015. Ravelry: AYAB Members. https://www.ravelry.com/groups/ayab/members. (accessed 2022-09-13).
[18]
Frikk Fossdal, Rogardt Heldal, and Nadya Peek. 2021. Interactive Digital Fabrication Machine Control Directly Within a CAD Environment. In Symposium on Computational Fabrication(SCF ’21). Association for Computing Machinery, New York, NY, USA, 1–15. https://doi.org/10.1145/3485114.3485120
[19]
Gerard Rubio. 2014. Building the OpenKnit Machine. https://www.instructables.com/Building-the-Open-Knit-machine/. (accessed 2022-09-13).
[20]
Gerard Rubio. 2014. OpenKnit.
[21]
Gerard Rubio. 2017. About. https://www.kniterate.com/about/. (accessed 2022-09-13).
[22]
Jordan Graves and Anne Sullivan. 2020. eLoominate: Tools for Casual Creation in Hybrid Craft. In Proceedings of the ICCC 2020 Workshops. CEUR-WS, Coimbra (PT) / Online, 5.
[23]
Jianzhe Gu, David E. Breen, Jenny Hu, Lifeng Zhu, Ye Tao, Tyson Van de Zande, Guanyun Wang, Yongjie Jessica Zhang, and Lining Yao. 2019. Geodesy: Self-Rising 2.5D Tiles by Printing along 2D Geodesic Closed Path. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. ACM, Glasgow, Scotland, UK, 1–10. https://doi.org/10.1145/3290605.3300267
[24]
Susan Guagliumi. 2008. Hand-Manipulated Stitches for Machine Knitters. Taunton Press, Newtown, CT.
[25]
Nur Al-huda Hamdan, Simon Voelker, and Jan Borchers. 2018. Sketch&Stitch: Interactive Embroidery for E-Textiles. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems - CHI ’18. ACM Press, Montreal QC, Canada, 1–13. https://doi.org/10.1145/3173574.3173656
[26]
Megan Hofmann, Lea Albaugh, Ticha Sethapakadi, Jessica Hodgins, Scott E. Hudson, James McCann, and Jennifer Mankoff. 2019. KnitPicking Textures: Programming and Modifying Complex Knitted Textures for Machine and Hand Knitting. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST ’19). Association for Computing Machinery, New Orleans, LA, USA, 5–16. https://doi.org/10.1145/3332165.3347886
[27]
Scott E. Hudson. 2014. Printing Teddy Bears: A Technique for 3D Printing of Soft Interactive Objects. In Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems - CHI ’14. ACM Press, Toronto, Ontario, Canada, 459–468. https://doi.org/10.1145/2556288.2557338
[28]
Alexandra Ion, Johannes Frohnhofen, Ludwig Wall, Robert Kovacs, Mirela Alistar, Jack Lindsay, Pedro Lopes, Hsiang-Ting Chen, and Patrick Baudisch. 2016. Metamaterial Mechanisms. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology. ACM, Tokyo Japan, 529–539. https://doi.org/10.1145/2984511.2984540
[29]
Benjamin Jones, Yuxuan Mei, Haisen Zhao, Taylor Gotfrid, Jennifer Mankoff, and Adriana Schulz. 2022. Computational Design of Knit Templates. ACM Transactions on Graphics 41, 2 (April 2022), 1–16. https://doi.org/10.1145/3488006
[30]
Alexandre Kaspar. 2022. Garment Design Workflows for On-Demand Machine Knitting. PhD Thesis. Massachusetts Institute of Technology, Cambridge, MA, USA.
[31]
Alexandre Kaspar, Kui Wu, Yiyue Luo, Liane Makatura, and Wojciech Matusik. 2021. Knit Sketching: From Cut & Sew Patterns to Machine-Knit Garments. ACM Transactions on Graphics 40, 4 (July 2021), 63:1–63:15. https://doi.org/10.1145/3450626.3459752
[32]
Jeeeun Kim, Haruki Takahashi, Homei Miyashita, Michelle Annett, and Tom Yeh. 2017. Machines as Co-Designers: A Fiction on the Future of Human-Fabrication Machine Interaction. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA ’17. ACM Press, Denver, Colorado, USA, 790–805. https://doi.org/10.1145/3027063.3052763
[33]
Jeeeun Kim, Clement Zheng, Haruki Takahashi, Mark D Gross, Daniel Ashbrook, and Tom Yeh. 2018. Compositional 3D Printing: Expanding & Supporting Workflows towards Continuous Fabrication. In Proceedings of the 2nd ACM Symposium on Computational Fabrication. ACM, Cambridge Massachusetts, 1–10. https://doi.org/10.1145/3213512.3213518
[34]
Jin Hee (Heather) Kim, Kunpeng Huang, Simone White, Melissa Conroy, and Cindy Hsin-Liu Kao. 2021. KnitDermis: Fabricating Tactile On-Body Interfaces Through Machine Knitting. In Designing Interactive Systems Conference 2021 (DIS ’21). Association for Computing Machinery, New York, NY, USA, 1183–1200. https://doi.org/10.1145/3461778.3462007
[35]
Jin Hee (Heather) Kim, Shreyas Dilip Patil, Sarina Matson, Melissa Conroy, and Cindy Hsin-Liu Kao. 2022. KnitSkin: Machine-Knitted Scaled Skin for Locomotion. In CHI Conference on Human Factors in Computing Systems. ACM, New Orleans LA USA, 1–15. https://doi.org/10.1145/3491102.3502142
[36]
Kris Basta. 2019. Needle Beetle Needle Selector for LK Machines. https://www.kriskrafter.com/product-page/needle-beetle-needle-selector. (accessed 2022-09-13).
[37]
Gierad Laput, Xiang ’Anthony’ Chen, and Chris Harrison. 2015. 3D Printed Hair: Fused Deposition Modeling of Soft Strands, Fibers, and Bristles. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology - UIST ’15. ACM Press, Daegu, Kyungpook, Republic of Korea, 593–597. https://doi.org/10.1145/2807442.2807484
[38]
Maria Larsson, Hironori Yoshida, and Takeo Igarashi. 2019. Human-in-the-Loop Fabrication of 3D Surfaces with Natural Tree Branches. In Proceedings of the ACM Symposium on Computational Fabrication (SCF ’19). Association for Computing Machinery, New York, NY, USA, 1–12. https://doi.org/10.1145/3328939.3329000
[39]
Yi-Chin Lee and Daniel Cardoso Llach. 2020. Hybrid Embroidery: Exploring Interactive Fabrication in Handcrafts. In ACM SIGGRAPH 2020 Art Gallery (SIGGRAPH ’20). Association for Computing Machinery, New York, NY, USA, 429–433. https://doi.org/10.1145/3386567.3388575
[40]
Zishun Liu, Xingjian Han, Yuchen Zhang, Xiangjia Chen, Yu-Kun Lai, Eugeni L. Doubrovski, Emily Whiting, and Charlie C. L. Wang. 2021. Knitting 4D Garments with Elasticity Controlled for Body Motion. ACM Transactions on Graphics 40, 4 (July 2021), 62:1–62:16. https://doi.org/10.1145/3450626.3459868
[41]
James McCann. 2017. The "Knitout" (.k) File Format v0.5.3. https://textiles-lab.github.io/knitout/knitout.html. (accessed 2018-04-03).
[42]
James McCann, Lea Albaugh, Vidya Narayanan, April Grow, Wojciech Matusik, Jennifer Mankoff, and Jessica Hodgins. 2016. A Compiler for 3D Machine Knitting. ACM Transactions on Graphics 35, 4 (July 2016), 49:1–49:11. https://doi.org/10.1145/2897824.2925940
[43]
Stefanie Mueller. 2016. Interacting with Personal Fabrication Devices. PhD Thesis. Universität Potsdam.
[44]
Stefanie Mueller, Anna Seufert, Huaishu Peng, Robert Kovacs, Kevin Reuss, François Guimbretière, and Patrick Baudisch. 2019. FormFab: Continuous Interactive Fabrication. In Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction. ACM, Tempe Arizona USA, 315–323. https://doi.org/10.1145/3294109.3295620
[45]
Georges Nader, Yu Han Quek, Pei Zhi Chia, Oliver Weeger, and Sai-Kit Yeung. 2021. KnitKit: A Flexible System for Machine Knitting of Customizable Textiles. ACM Transactions on Graphics 40, 4 (Aug. 2021), 1–16. https://doi.org/10.1145/3450626.3459790
[46]
Vidya Narayanan. 2022. Foundations for 3D Machine Knitting. PhD Thesis. Carnegie Mellon University, Pittsburgh, PA, USA. https://doi.org/10.1184/R1/19658928.v1
[47]
Vidya Narayanan, Lea Albaugh, Jessica Hodgins, Stelian Coros, and James Mccann. 2018. Automatic Machine Knitting of 3D Meshes. ACM Transactions on Graphics 37, 3 (Aug. 2018), 1–15. https://doi.org/10.1145/3186265
[48]
Huaishu Peng, Jimmy Briggs, Cheng-Yao Wang, Kevin Guo, Joseph Kider, Stefanie Mueller, Patrick Baudisch, and François Guimbretière. 2018. RoMA: Interactive Fabrication with Augmented Reality and a Robotic 3D Printer. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18). Association for Computing Machinery, Montreal QC, Canada, 1–12. https://doi.org/10.1145/3173574.3174153
[49]
Michael L. Rivera and Scott E. Hudson. 2019. Desktop Electrospinning: A Single Extruder 3D Printer for Producing Rigid Plastic and Electrospun Textiles. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems - CHI ’19. ACM Press, Glasgow, Scotland, UK, 1–12. https://doi.org/10.1145/3290605.3300434
[50]
Rundong Tian, Sarah Sterman, Ethan Chiou, Jeremy Warner, and Eric Paulos. 2018. MatchSticks: Woodworking through Improvisational Digital Fabrication. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18). Association for Computing Machinery, Montreal QC, Canada, 1–12. https://doi.org/10.1145/3173574.3173723
[51]
Tenley Schmida and Paolo Pedercini. 2018. GlitchScarf.
[52]
Eldon Schoop, Michelle Nguyen, Daniel Lim, Valkyrie Savage, Sean Follmer, and Björn Hartmann. 2016. Drill Sergeant: Supporting Physical Construction Projects through an Ecosystem of Augmented Tools. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA ’16). Association for Computing Machinery, San Jose, California, USA, 1607–1614. https://doi.org/10.1145/2851581.2892429
[53]
Ticha Sethapakdi, Daniel Anderson, Adrian Reginald Chua Sy, and Stefanie Mueller. 2021. Fabricaide: Fabrication-Aware Design for 2D Cutting Machines. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. ACM, Yokohama Japan, 1–12. https://doi.org/10.1145/3411764.3445345
[54]
A. Smailagic and D.P. Siewiorek. 1993. A Case Study in Embedded-System Design: The VuMan 2 Wearable Computer. IEEE Design & Test of Computers 10, 3 (Sept. 1993), 56–67. https://doi.org/10.1109/54.232473
[55]
David J. Spencer. 2001. Knitting Technology: A Comprehensive Handbook and Practical Guide (3rd ed.). Number 16 in Woodhead Publishing Series in Textiles. Woodhead Publishing, Cambridge, UK.
[56]
Sarah Spencer. 2021. Auto Changer. https://www.heartofpluto.co/autochanger. (accessed 2022-09-13).
[57]
Blair Subbaraman and Nadya Peek. 2022. P5.Fab: Direct Control of Digital Fabrication Machines from a Creative Coding Environment. In Designing Interactive Systems Conference (DIS ’22). Association for Computing Machinery, New York, NY, USA, 1148–1161. https://doi.org/10.1145/3532106.3533496
[58]
TensorFlow Authors. 2022. Image Classification.
[59]
Rundong Tian and Eric Paulos. 2021. Adroid: Augmenting Hands-on Making with a Collaborative Robot. In The 34th Annual ACM Symposium on User Interface Software and Technology (UIST ’21). Association for Computing Machinery, New York, NY, USA, 270–281. https://doi.org/10.1145/3472749.3474749
[60]
Rundong Tian, Vedant Saran, Mareike Kritzler, Florian Michahelles, and Eric Paulos. 2019. Turn-by-Wire: Computationally Mediated Physical Fabrication. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST ’19). Association for Computing Machinery, New Orleans, LA, USA, 713–725. https://doi.org/10.1145/3332165.3347918
[61]
Jasper Tran O’Leary and Nadya Peek. 2019. Machine-o-Matic: A Programming Environment for Prototyping Digital Fabrication Workflows. In The Adjunct Publication of the 32nd Annual ACM Symposium on User Interface Software and Technology. ACM, New Orleans LA USA, 134–136. https://doi.org/10.1145/3332167.3356897
[62]
Hannah Twigg-Smith, Jasper Tran O’Leary, and Nadya Peek. 2021. Tools, Tricks, and Hacks: Exploring Novel Digital Fabrication Workflows on #PlotterTwitter. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ’21). Association for Computing Machinery, New York, NY, USA, 1–15. https://doi.org/10.1145/3411764.3445653
[63]
Christian Weichel, John Hardy, Jason Alexander, and Hans Gellersen. 2015. ReForm: Integrating Physical and Digital Design through Bidirectional Fabrication. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology - UIST ’15. ACM Press, Daegu, Kyungpook, Republic of Korea, 93–102. https://doi.org/10.1145/2807442.2807451
[64]
Pierre Wellner. 1993. Interacting with Paper on the DigitalDesk. Commun. ACM 36, 7 (July 1993), 87–96. https://doi.org/10.1145/159544.159630
[65]
Kristin Williams, Jessica Hammer, and Scott E. Hudson. 2021. An Upcycled IoT: Building Tomorrow’s IoT out of Today’s Household Possessions. XRDS: Crossroads, The ACM Magazine for Students 27, 4 (June 2021), 19–25. https://doi.org/10.1145/3466872
[66]
Nur Yildirim, James McCann, and John Zimmerman. 2020. Digital Fabrication Tools at Work: Probing Professionals’ Current Needs and Desired Futures. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, Honolulu, HI, USA, 1–13. https://doi.org/10.1145/3313831.3376621
[67]
Amit Zoran and Joseph A. Paradiso. 2013. FreeD: A Freehand Digital Sculpting Tool. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI ’13. ACM Press, Paris, France, 2613. https://doi.org/10.1145/2470654.2481361
