Abstract

Computer-aided design (CAD) is a standard design tool used in engineering practice and by students. CAD has become increasingly analytic and inventive in incorporating artificial intelligence (AI) approaches to design, e.g., generative design (GD), to help expand designers' divergent thinking. However, because generative design technologies are relatively new, we know little about generative design thinking in students. This research aims to advance our understanding of the relationship between aspects of generative design thinking and traditional design thinking. This study was set in an introductory graphics and design course where student designers used Fusion 360 to optimize a bicycle wheel frame. We collected the following data from the sample: divergent and convergent psychological tests and an open-ended response to a generative design prompt (called the generative design reasoning elicitation problem). A Spearman's rank correlation showed no statistically significant relationship between generative design reasoning and divergent thinking. However, an analysis of variance found a significant difference in convergent thinking between groups with moderate GD reasoning and low GD reasoning. This study shows that new computational tools might present the same challenges to beginning designers as conventional tools. Instructors should be aware of informed design practices and encourage students to grow into informed designers by introducing them to new technology, such as generative design.

1 Introduction

Design is essential to engineering education and practice [1,2], but it is hard to learn and arguably harder to teach [3]. Computer-aided design (CAD) is a standard design tool used in engineering practice and by students [4], and advanced design techniques are now regularly available in CAD programs. One such design technique, generative design (GD), is an iterative design exploration process “that leverages the power of computationally driven artificial intelligence (AI) to automatically explore a wide design space in order to identify the best design options” [5]. While recent commercial successes tout the promise of generative design in engineering design [6], because it is such a recent technology, little research has been conducted surrounding how students and designers learn to interact with it.

Advanced AI computational technologies have the potential to greatly impact the design process [7] as well as design cognition models and theories [8]. Recent research suggests that both design process and design behavior are affected when designers use generative design tools in early-stage design [9]. We have argued that the paradigm shift in design methods due to the increased use of generative design technologies will change the design thinking mindset and must be addressed in the engineering education of the future workforce [10].

Due to this need to adapt to a future approach to engineering education, our overall goal is to understand the relationship between generative design thinking and traditional design thinking. In particular, this study investigates the relationship between one aspect of generative design thinking, generative design reasoning, and two core competencies of traditional design thinking, divergent and convergent thinking. While divergent and convergent thinking are only two core competencies of design thinking, and while generative design thinking likely encompasses more than reasoning, our current study seeks to contribute to understanding these features within a real engineering classroom in order to set the stage for future work. We employed a multi-method approach [11] to answer the research question: What is the relationship between students' generative design reasoning and their divergent and convergent thinking?

We define generative design reasoning as a cognitive concept where a designer uses problem-solving to justify their thinking and decision-making in a design activity. We elaborate on divergent and convergent thinking in Sec. 2.3.

Because generative design tools produce an expanded outcome space, we, as a team of design educators who incorporate generative design in our teaching and research, might expect proficient designers to exhibit a high level of convergent thinking, as the designer must evaluate and converge on a solution. Additionally, we expect that generative design might help students' ideation as the designer collaborates with advanced tools to create innovative and high-performing products. However, simply engaging students in the generative design process does not necessarily ensure that they will develop generative design reasoning. The study logic model is depicted in Fig. 1.

Fig. 1
Study logic model

2 Background and Motivation

2.1 Computer-Aided Design in Undergraduate Curricula and Design Cognition.

Design is included in all engineering curricula, and the ability to apply engineering design is a required outcome for a graduate of an ABET-certified engineering program, regardless of discipline in the United States (e.g., aerospace, agricultural, biomedical, chemical, civil, computer, electrical, environmental, industrial, materials, mechanical, nuclear, systems) [12] and is included in the European Network for Accreditation of Engineering Education (ENAEE) standards [13]. CAD is a technology used by design professionals to create digital artifacts with the functionality of 2D drafting, solid modeling, and surface modeling [5]. Ivan Sutherland's groundbreaking Sketchpad in 1962 was the earliest 3D representation program [14], but Autodesk's 1982 launch of AutoCAD®, the first commercially successful CAD program, helped establish CAD as an essential tool for engineers and other design professionals [5]. Traditional CAD programs are installed on local, single computers. However, in recent years, cloud-based CAD has gained popularity with cloud storage, collaboration capacities, and greater computing power [5].

Engineering students are often taught to communicate their design ideas through CAD and hand sketching [5,15]. CAD is a useful tool in engineering education because, compared to hand sketching, it allows a designer to show their design prototype from multiple viewpoints and move their design artifact dynamically. A fully constrained CAD model allows the designer to communicate details with others and use the model for powerful simulations such as finite element analysis, generative design, and motion analysis [5]. The benefits of teaching CAD at the undergraduate level include improving spatial visualization skills [16], developing communication and problem-solving skills [17], exposing students to design-based teamwork [18], and allowing better visualization of initial designs [19]. Further, CAD assists in the creative process by enhancing the ability of a team to visualize and communicate their ideas [18] and can promote the development of design thinking strategies [20]. Through participating in group projects that involve CAD, students improve their design and communication skills [3,21,22].

Despite the utility of CAD, in an educational setting, CAD might impact students' creative cognitive processes [23], and it might limit beginning designers' ability to explore more creative solutions [19,24,25]. Sreekanth and Viswanathan [19] observed that the presence of an example prevented full creative freedom because it led to designers fixating on specific features in that example. Other studies discuss unintended consequences of CAD use, such as how CAD might contribute to circumscribed thinking due to limitations of what can be created in a CAD system [23], premature fixation due to large amounts of detail being added to a drawing in early stages [23], and a focus on CAD technology inhibiting full incorporation of requirements (i.e., user perspectives in human-centered design) [26]. Therefore, CAD is important in engineering design education, but educators must use appropriate pedagogical approaches to facilitate students' growth in successful design behaviors [1,23] and design thinking habits [20,27].

2.2 Generative Design.

CAD programs are becoming increasingly analytic and inventive. For example, one advanced simulation technique is generative design, an iterative design exploration process that utilizes user-input parameters and leverages computationally driven AI to generate a large pool of design options [28]. Generative design software programs take specified product criteria and constraints to begin an evolutionary computation process that efficiently explores the entire parameter space supported by the software to find optimal solutions through multiple computational iterations [10]. Designers review the outputs of a generative design solution space to compare the solutions across multiple constraints (e.g., mechanical, manufacturing, spatial, and cost) [10]. Generative design software has numerous applications in aerospace, manufacturing, and consumer goods. For example, Airbus, a commercial aircraft manufacturer, leveraged Autodesk Fusion 360's generative design program to reduce the weight of their planes, in turn reducing the amount of fuel required, thereby saving cost and leaving a smaller carbon footprint [28]. They used generative design to engineer a 45% lighter cabin partition, requiring 95% less raw material to manufacture. The design requirements included reducing the weight of the component, structurally supporting the passenger, and attaching it to the plane in four locations. These design modifications saved Airbus an estimated 3180 kg of fuel per partition, per year [28].

Generative design originates from topology optimization (TO), which “helps engineers place material within a prescribed design domain to obtain the best structural performance” [29]. Generative design extends the function of TO further and allows designers to control more objective variables (e.g., manufacturing method); it is useful when the designer does not know the shape of the final part and wants the computer to account for variables such as functionality, manufacturing method, cost, and structural integrity. Both TO and generative design can be used to optimize load-bearing parts, but TO only provides a single optimized mesh model, whereas generative design leverages AI to explore a wider range of solutions [30]. In generating numerous solutions to a design question, the software allows designers to explore solutions that are too vast for humans to explore manually [31]. Therefore, using AI-assisted technology might allow designers to iterate more efficiently and engage in more divergent design thinking.

While generative design software helps designers come up with an array of design possibilities, designers then weigh design criteria and converge on a final design. Subjective variables such as aesthetic requirements are difficult to automate and require the designer to weigh competing designs [6]. Generative design requires collaboration from both a human designer and algorithmic computation [9]. Digital technologies have influenced design thinking and created a new role for designers and their integration with digital design media.

2.3 Design Cognition Via Convergent and Divergent Thinking.

Design cognition combines the fields of cognitive science and applied psychology to study the mental processes and representations involved in design [8,32]. Design educator and researcher Nigel Cross led some of the earliest research on design as a discipline [33]. Cross has further suggested that the three main areas of design cognition are formulating problems, generating solutions, and utilizing design process strategies [33].

Two notable elements of design cognition are divergent and convergent thinking, both coined by psychologist J. P. Guilford in 1956 [34]. Divergent thinking in creativity is generally defined as the ability to generate ideas that are both novel and useful [35]. Researchers generally differentiate between the fluency/frequency (i.e., the number of ideas generated) and the originality/novelty (of the generated ideas) of divergent thinking [36]. These concepts are closely related, respectively, to the Quantity and Novelty measures of engineering ideation effectiveness introduced by Shah et al. [37]. In design contexts, divergent thinking focuses on generating potential solutions to a problem. Designers engage in divergent thinking during the early stages of the design process, which require idea generation [38]. In contrast, convergent thinking generally focuses on quickly and accurately deriving a single solution to a problem [39]. Compared to divergent thinking skills, convergent skills are often better represented in engineering courses, but most designs require one to both diverge and converge to a final solution to a problem [40]. Although studies in design creativity have traditionally focused on divergent thinking, designers frequently shift between divergent and convergent thinking as “a hallmark of creative thinking” [41]. Therefore, studies in design cognition should consider assessing both types of thinking. For the current study, both divergent and convergent thinking are key components of design thinking required in engineering design and will represent core design thinking competencies, although we recognize that they are only two of the components relevant in this particular real-classroom study.

3 Conceptual Framework

Crismond and Adams' Informed Design Teaching and Learning Matrix [1], referred to hereafter as “the Matrix,” is a meta-analysis of over 50 studies of design behavior that contrasts beginning designers with informed designers. They define beginning designers as those with little or no experience in design and informed designers as those with some design experience and competence between that of a novice and an expert designer. There are nine design strategies outlined in the Matrix, but two patterns are of interest in this study: generating ideas and making design decisions. In addition to contrasting patterns in beginning and informed design behavior, the Matrix also suggests learning goals and instructional strategies to help move student designers from beginning to informed behaviors. We refer back to these strategies for teaching implications related to generative design.

When beginning designers generate their ideas, they tend only to devise a few concepts and fixate on them compared to informed designers, who generate multiple concepts using divergent thinking [42]. Beginning designers often begin their work with very few or only one idea that they are unwilling to discard or revise. In contrast, informed designers come up with many ideas, use divergent thinking to explore more designs, and do not favor a single solution [1].

Weighing options or making design decisions is another area where beginning and informed designers differ. Beginning designers tend to pay little attention to design criteria and constraints and focus on the bigger picture of the positives and negatives of the design without considering the associated benefits and tradeoffs [1]. Informed designers balance benefits and tradeoffs when they make design decisions as well as justify them. Beginning designers often cite the benefits of their preferred design choices while neglecting to mention associated tradeoffs; they also highlight the negative aspects of less favored approaches while passing over potential benefits [1].

We use Crismond and Adams [1] as a conceptual framework to understand two important design behaviors represented in the Matrix (generating ideas and making design decisions) within a generative design paradigm and to discuss results in the context of informed design practices. Generating ideas is of interest because of the role of generative design in creating a multitude of possible design solutions, and our goal of understanding any relationship between the ability of humans to create many solutions and their ability to reason through AI-generated solutions. Making design decisions is of interest in understanding if making design decisions in traditional design is related to reasoning through generative design solutions.

4 Methods

4.1 Participants and Course Context.

This pilot study was conducted in an introductory design and graphics course with 94 students (29% women and 71% men) at a large, public Midwestern university during the spring 2022 semester. Students were not expected to have any prior CAD experience before taking this course.

The course includes a whole-class lecture and a smaller laboratory section where students complete CAD modules and work on their semester-long team project. The students at this university have access to Fusion 360 and can perform generative design at no additional cost. Toward the end of the semester, the students completed a generative design module that asked them to optimize a bike frame/wheel design by comparing computer-generated designs and selecting one to analyze further. We received approval from the university's institutional review board (IRB) to analyze data collected from consenting students in the course.

4.2 Data Collected.

Data sources collected over the course of the semester included: (1) a divergent thinking test, (2) a convergent thinking test, and (3) a generative design reasoning elicitation problem. Figure 2 shows the number of students who completed each assignment. The divergent thinking test was also administered to 82 students in a separate first-year design course that focuses on building information design (BIM course) for comparison.

Fig. 2
Student participation by assignment

4.2.1 Divergent Thinking Test.

The alternative uses test (AUT) [43] was used to measure divergent thinking. The test was administered during the lecture and completed on participants' electronic devices via a Qualtrics survey. Students were asked to read the following instructions:

For this task, you'll be asked to come up with as many original and creative uses for a BOX as you can. The goal is to come up with creative ideas, which are ideas that strike people as clever, unusual, interesting, uncommon, humorous, innovative, or different. Your ideas don’t have to be practical or realistic; they can be silly or strange, even, so long as they are CREATIVE uses rather than ordinary uses.

Participants were given 3 min to respond, and asked to generate as many ideas as they could. Example responses could include: “step stool” or “doghouse.”

4.2.2 Convergent Thinking Test.

The VisAha! convergent thinking measure was developed by one of the authors by modifying the materials and procedure used by Ludmer et al. [44], in which participants were first shown black and white “camouflage” images of commonly recognized objects (e.g., sunglasses, shovel, fishing pole; Fig. 3) and then briefly shown the “real” image to induce a feeling of insight (an “Aha!” moment). Participants in the present study were shown 60 camouflaged images for up to 12 s each and asked to identify the object by typing their response. If a participant responded to an image, they were asked to indicate whether they solved the trial via insight (“… sudden and surprising, as if the answer just comes to you while viewing the image”), analysis (“… more deliberate, think of potential objects and try to find them in the camouflaged images”), or instantly (“The camouflaged image is not a puzzle for you, you see the object immediately”). Each trial consisted of one image and ended once the participant either responded before the 12 s had expired or was unable to identify the object within the time limit. The test was completed in a classroom setting, in one session that was presented as an extra credit opportunity. A sample of responses collected prior to the present study showed a normal distribution. While VisAha! was originally used with an fMRI brain scan to understand brain mechanisms, we used this test to understand convergent thinking in design because these responses (i.e., insight, analysis, instantaneous) shared similarities with how students approached generative design decisions in previous studies [45].

Fig. 3
Convergent thinking test images (the above image is presented to students and the bottom image is the answer)

4.2.3 Generative Design Module.

Throughout the semester, the students completed nine individual modeling assignments and simultaneously worked on a team design project. In the final modeling assignment (the generative design module), students used Fusion 360's generative design environment, learning a workflow to create a generative design with given design constraints and load conditions. Fusion 360's generative design environment requires the designer to: (1) identify and create critical structures of the component (i.e., the preserved geometry and the obstacle geometry), (2) choose a design objective (minimize the mass or maximize the stiffness of the structure), (3) specify a material, and (4) choose a method of manufacturing (e.g., two-axis cutting, additive). With these inputs, the software generates a set of outcomes to explore. These outcomes can be sorted by criteria (e.g., mass, minimum factor of safety, volume, estimated cost), and the model of any outcome can be exported for further iteration. In general, the assignment suggests that generative design tools can be incorporated in the early stages of product design to help facilitate design alternatives, and that doing so often requires thoughtful iteration.
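To illustrate the sort-and-filter step described above, the following is a minimal sketch using hypothetical outcome data and field names; it is not the Fusion 360 API, only an illustration of comparing outcomes across criteria such as mass, minimum factor of safety, and estimated cost.

```python
# Minimal sketch of exploring a generative design outcome set.
# The outcomes and field names below are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Outcome:
    name: str
    mass_kg: float
    min_factor_of_safety: float
    volume_cm3: float
    est_cost_usd: float
    manufacturing: str  # e.g., "additive", "2-axis cutting"

outcomes = [
    Outcome("Outcome 1", 1.8, 2.4, 230.0, 41.0, "additive"),
    Outcome("Outcome 2", 2.1, 3.1, 260.0, 38.0, "2-axis cutting"),
    Outcome("Outcome 3", 1.6, 1.9, 210.0, 45.0, "additive"),
]

# Keep only outcomes that meet a minimum safety-factor threshold ...
candidates = [o for o in outcomes if o.min_factor_of_safety >= 2.0]

# ... then rank the remaining designs by mass to compare alternatives.
for o in sorted(candidates, key=lambda o: o.mass_kg):
    print(f"{o.name}: {o.mass_kg} kg, FoS {o.min_factor_of_safety}, est. ${o.est_cost_usd}")
```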

In the assignment, students were asked to optimize a bike frame using shape optimization and a wheel frame using generative design (see Fig. 4). To begin using the generative design environment to optimize the bike's fork, students applied constraints (areas in which the design is fixed) and loads (forces exerted on the bike). After applying the design criteria (design objectives and manufacturing options), the students generated a set of outcomes to explore through generative design. In the generative design environment, the computer outputs a matrix of design solutions and allows the designer to use checkboxes to filter through designs (Fig. 5). Using the output matrix, students compared criteria such as mass, volume, safety factor, and material to determine what they considered to be the best design alternative. Students submitted a screenshot of their chosen design, along with responses to the three questions that make up the generative design reasoning elicitation problem: “Why did you choose this as the optimum generated design? What features made it stand out? What factors played into your decision?” The generative design reasoning elicitation problem was developed to investigate students' design approach to an open-ended scenario as they explained their reasoning in selecting a design solution from a multiple-solution outcome space. The students initiated the generative design module during their 2 h lab period and completed it at home. The generative design module was designed by an expert designer in the field.

Fig. 4
Generative design module—optimize wheel frame using generative design

Fig. 5
Generative design module possible outcomes

There were 61 complete submissions from the 94 students enrolled in the course. Many students opted out of this assignment since it was the last assignment of the semester, and they were allowed to drop one modeling assignment.

5 Data Analysis

We employed a multi-method approach [11] in the quantitative analysis of the divergent and convergent thinking tests and qualitative analysis of the generative design reasoning elicitation problem. These preliminary analyses are reviewed together to explore the relationships between design thinking (i.e., convergent and divergent thinking) and generative design reasoning.

5.1 Measure Analysis and Scoring.

Once the semester was completed, answers to the generative design reasoning elicitation problem, divergent thinking tests, and convergent thinking tests were compiled, and each student was given an identifier to maintain anonymity.

5.1.1 Divergent Thinking Test (AUT).

Responses to the AUT were scored across two concepts: fluency/frequency (i.e., number of ideas generated) and originality/novelty (i.e., average semantic distance of responses). Semantic distance (or relatedness) quantifies the novelty of a word or phrase in comparison to a larger sample of words or phrases [46]. The SemDis software [46] calculates the semantic distance of each AUT response in comparison to five unique semantic spaces. The distance of a response compared to the five spaces is then averaged for a single semantic distance metric, ranging between 0 and 2. A higher score indicates a more original response, and a higher average semantic distance score suggests greater originality in divergent thinking, i.e., more original/unique idea generation. Because divergent thinking is an essential aspect of creative cognition [41], we wanted to confirm that our sample of students was similarly “creative” compared to a comparable population (e.g., first-year engineering students at an R1 institution). We performed a t-test to determine whether there is a statistically significant difference between the divergent thinking (AUT) of two sets of first-year design students: our sample of students, who use Fusion 360 for product design, and a second set of students, who use Revit for Building Information Modeling.
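The following is a minimal sketch of this scoring and comparison step, assuming hypothetical per-response semantic distances (the actual distances came from the SemDis platform) and using SciPy's independent-samples t-test.

```python
# Sketch of the AUT originality scoring comparison described above.
# Per-response semantic distances are hypothetical placeholders; in the study
# they came from SemDis (averaged over five semantic spaces, range 0-2).
import numpy as np
from scipy import stats

def participant_score(response_distances):
    """Average semantic distance across one participant's AUT responses."""
    return float(np.mean(response_distances))

# Hypothetical per-response distances for a few participants in each course.
fusion_course = [participant_score(r) for r in ([0.91, 0.98, 0.88], [1.02, 0.95], [0.85, 0.93, 0.97])]
bim_course    = [participant_score(r) for r in ([0.96, 0.99], [0.92, 0.94, 1.01], [0.89, 0.97, 1.00])]

# Independent-samples t-test comparing average originality between the courses.
t_stat, p_value = stats.ttest_ind(fusion_course, bim_course)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```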

5.1.2 Convergent Thinking Test (VisAha!).

Each VisAha! submission was scored manually for correctness; every response was given a binary value. Participants were given the benefit of the doubt with spelling and grammar and were marked as correct if they accurately identified the degraded image. If “fruit” was submitted and the correct answer was “strawberry,” the response was marked incorrect, whereas if a participant answered “pawn” and the answer was “chess piece,” the response was deemed correct. Each student received an overall score as well as separate scores for identifications made via insight, analysis, and instant recognition. The responses of BIM course students were scored as well.
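A minimal sketch of this scoring scheme, with hypothetical trials and responses, and an illustrative synonym map standing in for the manual judgment calls described above:

```python
# Sketch of VisAha! scoring: each response is marked correct (1) or incorrect (0),
# and correct answers are tallied by the self-reported strategy. Data are hypothetical.
from collections import Counter

answer_key = {"trial_01": "sunglasses", "trial_02": "strawberry", "trial_03": "chess piece"}

# (trial, typed response, self-reported strategy)
responses = [
    ("trial_01", "sunglasses", "instant"),
    ("trial_02", "fruit", "analysis"),   # too generic -> scored incorrect
    ("trial_03", "pawn", "insight"),     # acceptable near-miss -> scored correct
]

# In the study this judgment was made manually; a synonym map stands in here.
accepted = {"chess piece": {"pawn", "chess piece"}}

total_correct = 0
correct_by_strategy = Counter()
for trial, answer, strategy in responses:
    target = answer_key[trial]
    is_correct = answer == target or answer in accepted.get(target, set())
    total_correct += int(is_correct)
    if is_correct:
        correct_by_strategy[strategy] += 1

print(total_correct, dict(correct_by_strategy))  # e.g., 2 {'instant': 1, 'insight': 1}
```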

5.1.3 Generative Design Reasoning Level.

We performed a content analysis to systematically categorize students' answers to the open-ended question, which asked them to explain their decision and reasoning for selecting a design from the generative design outcome space. Using our conceptual framework, the Informed Design Teaching and Learning Matrix, as a foundation, we coded the level of understanding (low, moderate, and sophisticated) with which students described the tradeoffs among design criteria performance (see Table 1 for the codebook). Two of the authors developed the coding scheme. Then one of the authors and a second researcher coded the open-ended responses. Inter-rater reliability averaged 90.2% agreement, and discrepancies were discussed to reach full consensus. Next, we performed a second content analysis to categorize the design elements that students referenced in their rationale.

Table 1

Coding scheme for students' responses to the generative design reasoning elicitation problem

Code: Low
Definition: The participant stated the design elements they considered but didn't weigh them or elaborate on their reasoning. They stated aesthetic reasons for choosing a design but didn't use tangible things to weigh options.
Example: “The smaller volume was the reason I chose that design. I think the wheels and the design made it stand out as well as the cylinder at the top. The design I choose had a lower safety factor as well” (SP22B_075).

Code: Moderate
Definition: The participant weighed distinctive design elements and explained some rationale.
Example: “I decided to choose this design as my optimal generated design because I felt that it was able to significantly decrease the volume. One reason I chose this version is because of its aesthetics of it; I specifically liked the web design of the left side of the object. However, one drawback of this generative design is that it significantly decreased the safety factor which may be a bad sign. On the other hand, it also decreased the volume which indicates that it uses less material and will be cheaper to manufacture” (SP22B_035).

Code: Sophisticated
Definition: The participant weighed design elements using context and considered the design as well as connected the study to reality.
Example: “I chose this mainly the fact that it had one of the highest factors of safety measurements. In comparison to the other designs, it was a little heavier and contained a higher volume, but it had a high amount of stress that could be endured before failure. This is important when designing this part of the bike because this part of the bike may undergo a high amount of stress and it is important to keep the rider safe. That is why I chose this specific design.” (SP22B_012).
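As a minimal sketch of the percent-agreement calculation used for inter-rater reliability (the code assignments below are hypothetical):

```python
# Hypothetical code assignments from two coders for the same five responses.
rater_1 = ["low", "moderate", "sophisticated", "moderate", "low"]
rater_2 = ["low", "moderate", "sophisticated", "low", "low"]

# Simple percent agreement; disagreements were then discussed to consensus.
agreements = sum(a == b for a, b in zip(rater_1, rater_2))
print(f"{100 * agreements / len(rater_1):.1f}% agreement")
```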

5.2 Statistical Analysis.

To test the relationship between students' generative design reasoning and their divergent thinking, we performed a Spearman's rank correlation. To test the relationship between students' generative design reasoning and their convergent thinking, we performed a one-way between-subjects analysis of variance (ANOVA) comparing responses to the VisAha! test of convergent thinking between designers with low, moderate, and sophisticated generative design reasoning levels. Specifically, four ANOVA tests were conducted based on the four accuracy metrics gathered by the VisAha! test: total correct, total correct answered via insight, total correct answered via analysis, and total correct answered instantly. All analyses were conducted using SPSS version 28.0.1.0.
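The following is a minimal sketch of these two tests using SciPy and hypothetical data; the study's analyses were run in SPSS.

```python
# Sketch of the Spearman correlation and one-way ANOVA described above.
from scipy import stats

# Hypothetical data: average semantic distance (AUT) and ordinal GD reasoning
# level (1 = low, 2 = moderate, 3 = sophisticated) for the same participants.
semdis_avg = [0.91, 0.88, 0.97, 1.02, 0.85, 0.94]
gd_level   = [1, 2, 3, 2, 1, 3]
rho, p = stats.spearmanr(semdis_avg, gd_level)
print(f"Spearman rho = {rho:.3f}, p = {p:.3f}")

# One-way between-subjects ANOVA on a VisAha! accuracy metric by reasoning level.
low = [12, 17, 15]
moderate = [25, 22, 27]
sophisticated = [20, 23, 21]
f_stat, p = stats.f_oneway(low, moderate, sophisticated)
print(f"F = {f_stat:.2f}, p = {p:.3f}")
```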

6 Results

Results of this study are presented first for quantitative analysis of the divergent thinking and convergent thinking tests and the qualitative analysis of open-ended responses to the generative design reasoning elicitation problem. The second section provides the statistical results testing the relationship between generative design reasoning and divergent and convergent thinking.

6.1 Intermediate Analysis Results

6.1.1 Divergent Thinking Test (AUT).

The results of the AUT indicated that students provided between 1 and 28 unique answers to the prompt, with a mean frequency (i.e., number of responses) of 5.6, as shown in Table 2. The semantic distance is normally distributed, while the frequency is not. No significant difference was found between the average semantic distance of AUT responses from the participants in our study (M = 0.93; SD = 0.07) and the students from the BIM course (M = 0.95; SD = 0.07), t(147) = −1.781, p = 0.077 (Table 3).

Table 2

Descriptive statistics for the AUT

                           M      SD     Min.   Max.
AUT: Frequency             5.6    5.2    1      28
AUT: Semantic distance     0.94   0.07   0.76   1.09
Table 3

AUT semantic distance descriptive statistics by course

                              M       SD       Min.   Max.
BIM course (n = 82)           0.931   0.0701   –      1.085
Study participants (n = 65)   0.952   0.0671   –      1.093

6.1.2 Convergent Thinking Test (VisAha!).

The results of VisAha! indicated that, on average, students got 20.8 correct responses out of 60, with a maximum score of 46. More students cited that they made their decision instantly than through insight or analysis.

6.1.3 Generative Design Reasoning Elicitation Problem.

The results of the generative design reasoning elicitation problem indicate that there is a relatively even distribution of levels of generative design reasoning among the class, with 26.2% demonstrating a low, 39.3% a moderate, and 34.4% a sophisticated level of understanding. Students in all three generative design reasoning levels cited reasons such as volume/mass, factor of safety, appearance, cost, material, manufacturing process, stress analysis, and reproducibility/functionality (see Table 4). Students tended to cite reasons such as volume, safety factor, and appearance, which were terms used in the assignment, more often than more abstract concepts, such as the Von Mises criterion, or less technical concepts, such as product functionality. As students could cite as many reasons for their chosen design as they wished, these categories are not mutually exclusive.

Table 4

Design criteria referenced in reasoning counts (n = 61)

Design criterion                  n
Volume/Mass                       41
Safety factor                     41
Appearance                        35
Cost                              16
Material                          9
Von Mises                         5
Manufacturing process             5
Reproducibility/Functionality     5

Using our conceptual framework, we also reviewed how students discussed the pros and cons of their chosen design. Table 5 summarizes how students weighed the design criteria: 43% only cited the positive(s) of their chosen best design and 52% compared a positive element(s) of their chosen design to the disadvantage(s) of another design. Only three students weighed the advantages of their final design against the disadvantages of that same design, suggesting very few students behaved as informed designers as they made their design decisions.

Table 5

Number of students by method of weighing pros and cons of the chosen design

Method of weighing the chosen design                                          Count   Percentage of total
Listed positives of the chosen design                                         26      42.6%
Weighed the positives of the chosen design and the negatives of another       32      52.5%
Weighed the positives and negatives of their final design                     3       4.9%

6.2 Statistical Results.

A Spearman's rank correlation was computed to assess the relationship between the average creativity of the divergent thinking test (SemDis average) and generative design reasoning. The correlation coefficient, ρ, ranges from −1 to +1. There was no statistically significant correlation found between the two variables, Spearman's ρ (46) = −0.062, n = 48, p = 0.676.

A one-way between subjects ANOVA test was run to test for differences in responses to the VisAha! test of convergent thinking between designers with low, moderate, and sophisticated generative design reasoning levels. Specifically, four ANOVA tests were conducted based on the four accuracy metrics gathered by the VisAha! test: total correct, total correct answered via insight, total correct answered via analysis, and total correct answered instantly. See Table 6 for the means and standard errors of each group for these metrics.

Table 6

Means and standard error of VisAha! by generative design reasoning level

Generative design reasoning level   n    Total correct M (SE)   Total correct; insight M (SE)   Total correct; analysis M (SE)   Total correct; instant M (SE)
Low                                 8    14.63 (3.22)           2.38 (1.17)                     4.25 (1.13)                      8 (1.76)
Moderate                            13   24.46 (1.98)           2.87 (0.89)                     6.46 (0.57)                      15.15 (1.53)
Sophisticated                       11   21.55 (1.69)           3.45 (1.02)                     5.18 (0.98)                      12.91 (1.31)

An ANOVA found a significant difference in the total number of VisAha! correct answers (using any method) between designers of low, moderate, and sophisticated generative design reasoning levels, F(2,29) = 4.17, p = 0.03, η2 = 0.22. Tukey's post-hoc analyses showed that a significant difference was found between the groups with moderate (M = 24.46, SE = 1.98) and low (M = 14.63, SE = 3.22) generative design reasoning, p = 0.02.
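For reference, in a one-way between-subjects design, eta squared can be recovered from the F ratio and its degrees of freedom; the reported values above are consistent with this relation:

\eta^2 = \frac{SS_{\mathrm{between}}}{SS_{\mathrm{between}} + SS_{\mathrm{within}}} = \frac{F \, df_{\mathrm{between}}}{F \, df_{\mathrm{between}} + df_{\mathrm{within}}} = \frac{4.17 \times 2}{4.17 \times 2 + 29} \approx 0.22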

A second ANOVA found a significant difference in the number of VisAha! correct answers (instant) between designers of low, moderate, and sophisticated generative design reasoning levels, F(2,29) = 4.62, p = 0.02, η2 = 0.24. Tukey's post-hoc analyses again showed that a significant difference was found between the groups with moderate (M = 15.15, SE = 1.53) and low (M = 8, SE = 1.76) generative design reasoning, p = 0.01.
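A minimal sketch of a Tukey HSD post-hoc comparison of this kind, using statsmodels and hypothetical scores grouped by reasoning level (the study's post-hoc tests were run in SPSS):

```python
# Sketch of Tukey's HSD post-hoc comparisons following a significant omnibus ANOVA.
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical VisAha! scores grouped by generative design reasoning level.
scores = np.array([12, 17, 15, 14, 25, 22, 27, 24, 26, 20, 23, 21, 22])
groups = np.array(["low"] * 4 + ["moderate"] * 5 + ["sophisticated"] * 4)

result = pairwise_tukeyhsd(endog=scores, groups=groups, alpha=0.05)
print(result)  # pairwise mean differences, adjusted p-values, and reject decisions
```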

No statistically significant differences were found in the number of VisAha! correct answers (insight) between designers of low, moderate, and sophisticated generative design reasoning levels, F(2,29) = 0.23, p = 0.79, η2 = 0.01. Additionally, no significant differences were found in the number of VisAha! correct answers (analysis) between designers of low, moderate, and sophisticated generative design reasoning levels, F(2,29) = 1.47, p = 0.24, η2 = 0.09.

7 Discussion

Educators have identified challenges in learning and teaching design thinking skills [3,47]. This study contributes to understanding any relationship between divergent and convergent thinking and generative design reasoning. The abilities to practice idea fluency and to make tradeoffs are essential parts of engineering design and exist along a continuum between beginning and experienced designers [1]. Researchers have stressed the need to understand how design tools, such as CAD, impact ideation and idea evaluation [48], and more recently, others have noted that generative design tools may impact the overall design process [9]. Because generative design explores a wide design space in order to identify design options [5], we expect that generative design might help students' ideation as the designer collaborates with advanced tools to create innovative and high-performing products. Further, we expect that the ability to make tradeoffs becomes even more critical when using generative design tools that produce many design alternatives. The generative design module in this study aims to scaffold the use of generative design tools in a familiar context and allow students to reflect on their design strategies. By analyzing students' approaches to generative design reasoning in conjunction with results from psychological tests measuring divergent and convergent thinking, we can better describe engineering student design thinking in this emerging field of generative design and propose educational practices to encourage students to grow into informed designers.

7.1 Informed Design and Generative Design Tradeoffs.

Results from analysis of the generative design reasoning elicitation problem suggest that students act as beginning designers [1] when using generative design tools. Students tended only to cite positive design elements of the solution they selected (42.6%) or positives of their chosen solution paired with negatives of another solution (52.5%) in their design reasoning when weighing design criteria in the generative design environment. While this behavior is consistent with the way beginning designers make design decisions [1], it is concerning because when using generative design tools, designers need to develop their ability to make tradeoffs in a large solution space. In addition, students are apt to make important science connections when they articulate tradeoffs between different solutions [49], which are important from a teaching and learning perspective. Therefore, engineering educators must consider how to develop generative design reasoning, further described in this section.

In addition, students tended to cite reasoning with arguably easier-understood quantitative design criteria, such as volume and factor of safety, and were less likely to cite more advanced quantitative concepts, such as Von Mises stress, or holistic design concepts, such as overall functionality. Students tended to cite reasoning using language from the design module and from within the particular generative design tool they were using, which might have limited the concepts they discussed. Previous research suggests that engineering students grow in their ability to support design decisions as they advance through their programs. For example, research from McKenna and team found that students build design process knowledge in order to understand design solutions [50], and that capstone students were more likely than first-year students to use computational and analytical evidence as support for their design decisions [50].

Engineering education implications: Results from this study, in conjunction with teaching strategies from Crismond and Adams' Matrix, suggest that asking students to justify their design decisions is beneficial and should be further scaffolded by regularly asking them about both the positives and negatives of designs, not just in generative design scenarios. In addition, the Matrix recommends prompting students with questions that connect the rationality of design with their emotions so that they can articulate their values, such as in human-centered design. Other studies have concluded that many beginning designers demonstrate a technology-centered understanding of human-centered design, meaning they describe the technical elements of the design (e.g., volume/mass and safety factor), and few beginning designers demonstrate human-centered design behavior by using the context of the product to consider how it will be used [51]. In our example, such questions might ask, “Which design would be easier for a human to use?”, allowing students to consider and articulate a more comprehensive set of criteria in their generative design reasoning.

7.2 Traditional Design Thinking Might Not Indicate Generative Design Thinking.

Using the results from Spearman's rank correlation, we found no statistically significant relationship between students' divergent thinking and their level of generative design reasoning. In other words, students' divergent (creative) thinking bore no statistically significant relationship to how well they reasoned through generative design solutions. This finding might not be surprising, as the design competency of reasoning through alternatives might not have much in common with the competency of idea fluency. Moreover, recent research has pointed to generative design as constraint-driven design, leading to a different type of design process creativity compared to traditional design [9].

Using a one-way between-subjects ANOVA, we found a statistically significant difference in students' convergent thinking between the groups with moderate and low generative design reasoning. While the group of students who exhibited a sophisticated level of generative design reasoning scored higher than the low reasoning group, the difference was not large enough to be statistically significant. The statistically significant finding might not be surprising, as we would expect student designers who proficiently converge on one solution in a generative design space also to exhibit traditional convergent thinking. In traditional design, beginning designers “can be oblivious” to the intricacies of design decisions [1], while informed designers “are practiced at weighing and articulating” solutions [1]. Recent research suggests that designers evaluate generative design tool-generated outcomes in many ways, such as visually and analytically, before further iterating on a particular solution [9]. Further, experienced designers use criteria beyond performance objectives in their design decisions, considering many factors “when selecting from the results created by the generative design tool” [9]. Therefore, it is an important engineering practice and generative design practice to evaluate and reason through design solution alternatives. Results suggest that students who exhibit at least a moderate level of generative design reasoning might also exhibit more traditional convergent thinking abilities.

Engineering education implications: Results from this study, in conjunction with teaching strategies from Crismond and Adams' Matrix, suggest that, in addition to the recommendations from Sec. 7.1 (i.e., asking students to justify their design decisions and prompting them with questions that connect the rationality of design with their emotions so that they can articulate their values), educators should take into account the (potential lack of a) relationship between divergent and convergent thinking and generative design rationale in beginning designers. We still have much to learn about how students develop generative design competencies and whether these competencies are related to other traditional design competencies.

8 Limitations and Future Research

While this study makes progress in studying generative design thinking in an undergraduate engineering context, it has limitations. In particular, this study is set at one institution in one course with students who self-selected into the study based on attendance, willingness to complete an extra assignment, and commitment at the end of the semester. Future research could expand this study and assignment to a broader population to better understand overall variation and to substantially increase the sample size. Our divergent thinking measure compares students to peers in the same dataset rather than to a global standard. However, we have attempted to show that the students in this study do not significantly differ from a second set of students in a class with a very different design context. Future work could assess divergent thinking in a way that compares students with other designers of varying amounts of experience. While our convergent thinking measure (VisAha!) has not been validated, we conducted preliminary analyses with an unrelated sample to check for a normal distribution in participant responses. Future analyses should be conducted to correlate VisAha! with other measures of convergent thinking. VisAha! is relatively time-intensive; as such, we had a small sample (n = 32), which limits generalizability and calls for future work with a larger sample. Finally, this study used Autodesk's Fusion 360 as the design platform. While Fusion 360 is one of the industry leaders in the generative design space, the usability of generative design software could impact the user's understanding and comprehension of generative design.

9 Conclusions and Teaching Implications

While engineering design has been a part of the curriculum for decades, research into how beginning designers approach AI-driven CAD, such as generative design, is still new. In this study, we analyzed the relationship between students' generative design reasoning and their performance on two tests measuring divergent and convergent thinking. While we did not find a statistically significant relationship between generative design reasoning and divergent thinking, we did find a statistically significant relationship between generative design reasoning and convergent thinking. One of the key contributions of this paper is to advance how we think about the relationship between traditional design behaviors and those of generative design in order to inspire future research. Our findings, in conjunction with the conceptual framework of Crismond and Adams' Informed Design Teaching and Learning Matrix, can help position design educators to support engineering students in moving from beginning to informed designers, even when engaging with new computational techniques and tools, including generative design.

Acknowledgment

We appreciate the work from our student participants, early generative design discussions with guidance from Dan Banach at Autodesk, and feedback from the reviewers that served to strengthen this manuscript significantly. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the view of the funding bodies.

Funding Data

• National Science Foundation (NSF) under Grant DUE #2207408.

Conflict of Interest

There are no conflicts of interest.

Data Availability Statement

The datasets generated and supporting the findings of this article are obtainable from the corresponding author upon reasonable request.

References

1. Crismond, D. P., and Adams, R. S., 2012, "The Informed Design Teaching & Learning Matrix," J. Eng. Educ., 101(4), pp. 738–797.
2. Daly, S. R., Adams, R. S., and Bodner, G. M., 2012, "What Does It Mean to Design? A Qualitative Investigation of Design Professionals' Experiences," J. Eng. Educ., 101(2), pp. 187–219.
3. Dym, C. L., Agogino, A. M., Eris, O., Frey, D. D., and Leifer, L. J., 2005, "Engineering Design Thinking, Teaching, and Learning," J. Eng. Educ., 94(1), pp. 103–120.
4. Pucha, R. V., and Utschig, T. T., 2012, "Learning-Centered Instruction of Engineering Graphics for Freshman Engineering Students," J. STEM Educ. Innov. Res., 13(4), p. 24.
5. Leake, J. M., Goldstein, M. H., and Borgerson, J. L., 2022, Engineering Design Graphics: Sketching, Modeling, and Visualization, John Wiley and Sons, Hoboken, NJ.
6. Mountstephens, J., and Teo, J., 2020, "Progress and Challenges in Generative Product Design: A Review of Systems," Computers, 9(4), p. 80.
7. Duffy, A., Hay, L., Grealy, M., and Vuletic, T., 2019, "ImagineD: A Vision for Cognitive Driven Creative Design," 30th Anniversary Heron Island Conference on Computational and Cognitive Models of Creativity, Heron Island, Queensland, Australia.
8. Hay, L., Cash, P., and McKilligan, S., 2020, "The Future of Design Cognition Analysis," Des. Sci., 6, p. e20.
9. Saadi, J. I., and Yang, M. C., 2023, "Generative Design: Reframing the Role of the Designer in Early-Stage Design Process," ASME J. Mech. Des., 145(4), p. 041411.
10. Demirel, H. O., Goldstein, M. H., Li, X., and Sha, Z., 2023, "Human-Centered Generative Design Framework: An Early Design Framework to Support Concept Creation and Evaluation," Int. J. Hum. Comput. Interact., pp. 1–12.
11. Morse, J. M., 2003, "Principles of Mixed Methods," Handbook of Mixed Methods in Social & Behavioral Research, A. Tashakkori and C. Teddlie, eds., Sage Publications, Inc., Thousand Oaks, CA, Vol. 189.
12. Accreditation Board for Engineering and Technology (ABET), 2021, "Criteria for Accrediting Engineering Programs, 2021–2022," https://www.abet.org/accreditation/accreditation-criteria/criteria-for-accrediting-engineering-programs-2021-2022/
13. European Network for Accreditation of Engineering Education, "EUR-ACE Framework Standards and Guidelines," https://www.enaee.eu/eur-ace-system/standards-and-guidelines/#standards-and-guidelines-for-accreditation-of-engineering-programmes
14. Simon, H. A., 1988, "The Science of Design: Creating the Artificial," Des. Issues, 4(1/2), pp. 67–82.
15. Ye, X., Peng, W., Chen, Z., and Cai, Y.-Y., 2004, "Today's Students, Tomorrow's Engineers: An Industrial Perspective on CAD Education," Comput. Des., 36(14), pp. 1451–1460.
16. Sorby, S. A., 2009, "Educational Research in Developing 3-D Spatial Skills for Engineering Students," Int. J. Sci. Educ., 31(3), pp. 459–480.
17. Brink, H., Kilbrink, N., and Gericke, N., 2023, "Teach to Use CAD or Through Using CAD: An Interview Study With Technology Teachers," Int. J. Technol. Des. Educ., 33(3), pp. 957–979.
18. Robertson, B., and Radcliffe, D., 2006, "The Role of Software Tools in Influencing Creative Problem Solving in Engineering Design and Education," International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Philadelphia, PA, Sept. 10–13, pp. 999–1007.
19. Sreekanth, A. P., and Viswanathan, V. K., 2019, "A Study on the Role of Computer-Aided Design in Design Creativity and Education," Eng. Des. Graph. J., 83.
20. Karabiyik, T., Magana, A. J., Parsons, P., and Seah, Y. Y., 2023, "Pedagogical Approaches for Eliciting Students' Design Thinking Strategies: Tell-and-Practice vs. Contrasting Cases," Int. J. Technol. Des. Educ., 33(3), pp. 1087–1119.
21. Froyd, J. E., and Ohland, M. W., 2005, "Integrated Engineering Curricula," J. Eng. Educ., 94(1), pp. 147–164.
22. Taleyarkhan, M., Dasgupta, C., Garcia, J. M., and Magana, A. J., 2018, "Investigating the Impact of Using a CAD Simulation Tool on Students' Learning of Design Thinking," J. Sci. Educ. Technol., 27(4), pp. 334–347.
23. Robertson, B. F., Walther, J., and Radcliffe, D. F., 2007, "Creativity and the Use of CAD Tools: Lessons for Engineering Design Education From Industry," ASME J. Mech. Des., 129(7), pp. 753–760.
24. Walther, J., and Radcliffe, D., 2006, "Engineering Education: Targeted Learning Outcomes or Accidental Competencies?," 2006 Annual Conference & Exposition, Chicago, IL, June 4–7, pp. 11–557.
25. Lawson, B., 2002, "CAD and Creativity: Does the Computer Really Help?," Leonardo, 35(3), pp. 327–331.
26. Sanders, E. A., Goldstein, M. H., and Hess, J., 2023, "Instructional Characteristics That Promote and Inhibit More Comprehensive Ways of Experiencing Human-Centered Design," Int. J. Technol. Des. Educ., (in press).
27. Smetana, L. K., and Bell, R. L., 2012, "Computer Simulations to Support Science Instruction and Learning: A Critical Review of the Literature," Int. J. Sci. Educ., 34(9), pp. 1337–1370.
28. McKnight, M., 2017, "Generative Design: What It Is? How Is It Being Used? Why It's a Game Changer," KnE Eng., 2(2), pp. 176–181.
29. Sigmund, O., and Maute, K., 2013, "Topology Optimization Approaches: A Comparative Review," Struct. Multidiscip. Optim., 48(6), pp. 1031–1055.
30. Wang, Z., Zhang, Y., and Bernard, A., 2021, "A Constructive Solid Geometry-Based Generative Design Method for Additive Manufacturing," Addit. Manuf., 41, p. 101952.
31. Krish, S., 2011, "A Practical Generative Design Method," Comput. Des., 43(1), pp. 88–100.
32. Hay, L., Duffy, A. H. B., McTeague, C., Pidgeon, L. M., Vuletic, T., and Grealy, M., 2017, "A Systematic Review of Protocol Studies on Conceptual Design Cognition: Design as Search and Exploration," Des. Sci., 3, p. e10.
33. Cross, N., 1982, "Designerly Ways of Knowing," Des. Stud., 3(4), pp. 221–227.
34. Guilford, J. P., 1956, "The Structure of Intellect," Psychol. Bull., 53(4), pp. 267–293.
35. Runco, M. A., and Jaeger, G. J., 2012, "The Standard Definition of Creativity," Creat. Res. J., 24(1), pp. 92–96.
36. Acar, S., and Runco, M. A., 2019, "Divergent Thinking: New Methods, Recent Research, and Extended Theory," Psychol. Aesthet. Creat. Arts, 13(2), pp. 153–158.
37. Shah, J. J., Smith, S. M., and Vargas-Hernandez, N., 2003, "Metrics for Measuring Ideation Effectiveness," Des. Stud., 24(2), pp. 111–134.
38. Li, X., Wang, Y., and Sha, Z., 2023, "Deep Learning Methods of Cross-Modal Tasks for Conceptual Design of Product Shapes: A Review," ASME J. Mech. Des., 145(4), p. 041401.
39. Cropley, A., 2006, "In Praise of Convergent Thinking," Creat. Res. J., 18(3), pp. 391–404.
40. Zabelina, D. L., and Silvia, P. J., 2020, "Percolating Ideas: The Effects of Caffeine on Creative Thinking and Problem Solving," Conscious. Cogn., 79, p. 102899.
41. Goldschmidt, G., 2016, "Linkographic Evidence for Concurrent Divergent and Convergent Thinking in Creative Design," Creat. Res. J., 28(2), pp. 115–122.
42. Akin, Ö., and Akin, C., 1996, "Frames of Reference in Architectural Design: Analysing the Hyperacclamation (Aha-!)," Des. Stud., 17(4), pp. 341–361.
43. Guilford, J. P., 1967, "Creativity: Yesterday, Today and Tomorrow," J. Creat. Behav., 1(1), pp. 3–14.
44. Ludmer, R., Dudai, Y., and Rubin, N., 2011, "Uncovering Camouflage: Amygdala Activation Predicts Long-Term Memory of Induced Perceptual Insight," Neuron, 69(5), pp. 1002–1014.
45. Goldstein, M. H., Sommer, J., Buswell, N. T., Li, X., Sha, Z., and Demirel, H. O., 2021, "Uncovering Generative Design Rationale in the Undergraduate Classroom," 2021 IEEE Frontiers in Education Conference (FIE), Lincoln, NE, Oct. 13–16, IEEE, pp. 1–6.
46. Beaty, R. E., and Johnson, D. R., 2021, "Automating Creativity Assessment With SemDis: An Open Platform for Computing Semantic Distance," Behav. Res. Methods, 53(2), pp. 757–780.
47. Wrigley, C., and Straker, K., 2017, "Design Thinking Pedagogy: The Educational Design Ladder," Innov. Educ. Teach. Int., 54(4), pp. 374–385.
48. Jang, J., and Schunn, C. D., 2012, "Physical Design Tools Support and Hinder Innovative Engineering Design."
49. Purzer, Ş., Goldstein, M. H., Adams, R. S., Xie, C., and Nourian, S., 2015, "An Exploratory Study of Informed Engineering Design Behaviors Associated With Scientific Explanations," Int. J. STEM Educ., 2(1), pp. 1–12.
50. McKenna, A. F., 2007, "An Investigation of Adaptive Expertise and Transfer of Design Process Knowledge."
51. Sanders, E. A., Goldstein, M. H., and Hess, J. L., 2021, "Assessing Ways of Experiencing Human-Centered Design Via Student Reflections," 2021 ASEE Virtual Annual Conference Content Access, Virtual.