Concepedia

Publication | Open Access

Using Mobile Location‐Based Augmented Reality to Support Outdoor Learning in Undergraduate Ecology and Environmental Science Courses

47

Citations

34

References

2018

Year

Abstract

Augmented reality (AR) applications use a technological device to visually display digital information so that it appears to be overlaid, embedded in, or activated by the physical environment. Augmented reality is an emerging technology that falls on a spectrum between real and virtual (Milgram and Kishino 1994); current descriptors of points along this spectrum include mixed reality, AR, and virtual reality (Klopfer 2008, Liu et al. 2017). While AR is currently most visible in entertainment and gaming industries (e.g., Pokemon Go), there is growing theoretical and empirical evidence that AR supports learning and engagement (Price and Rogers 2004, Dede 2009, Radu 2014; Reilly and Dede, in press), and it is important that environmental educators for all ages consider the opportunities and challenges these technologies present (McCauley 2017). Because AR is an emerging medium, there are a number of formats that fall under the AR “umbrella,” and we describe the relevant distinctions below. There are two primary formats for AR—location-based and vision-based AR—and each offers different opportunities to support learning (Dunleavy 2014, Dunleavy and Dede 2014). There are also two primary modes for delivering AR experiences—through mobile devices (like smartphones and tablets) or through head-mounted displays (like the Microsoft HoloLens; Radu 2014). Vision-based AR allows a designer to link digital information and media with a physical “trigger,” which might be an object, image, or Quick Response (QR) code (like the black-and-white square shown in Fig. 1a). The camera(s) on a smartphone or tablet, or on a head-mounted display (like Microsoft HoloLens), are used to recognize the pattern of the trigger and activate the associated information and media, which is then displayed to the user. This works in a similar way to a barcode that is scanned at a grocery store in order to reveal the price of an item. In contrast, location-based AR involves learners using GPS-enabled smartphones or tablets to activate media at particular locations in an outdoor space (Fig. 1b). A designer uses a map-based online interface to embed digital information and media at locations of interest, and the embedded information or media is activated when the user reaches that location. After being activated by location or by a vision-based trigger, the AR application superimposes digital media, data, audio, video, art, and/or narratives on the real world, making this information appear to be embedded or overlaid on the real environment. Prior work on the use of AR in undergraduate teaching and learning contexts has focused largely on vision-based applications of AR. Studies in this area show that vision-based AR can support student understanding of concepts that require abstraction or interpretation of complex spatial relationships—concepts for which visualization is a useful tool (Radu 2014). For example, work by Lin et al. (2013) demonstrates that undergraduate students learned more about elastic collisions using an AR application for physics compared to a 2D simulation, while Shelton and Hedley (2002) report on improvements among undergraduate geography students in their factual and conceptual understandings of complex spatial concepts associated with Earth–Sun relationships following use of an AR display. Also, when applying vision-based AR to physics and astronomy laboratories, the AR treatment had positive effects on students’ attitudes, skills, and conceptual understanding-related specific concepts (Yen et al. 2013, Akçayır et al. 2016). While these studies represent a valuable “proof of concept,” more work needs to be done to characterize how AR interfaces may support learning in undergraduate classrooms, and to identify the limits in the utility of AR. One limitation of the prior work is that much of this has been done in indoor learning environments with vision-based AR, and the potential for location-based AR to support learning among young adults in outdoor contexts has been under-studied. Studies in mixed-reality contexts suggest that overlaying digital information with real physical objects can help support transfer by bridging abstract and concrete forms of understanding (Quarles, Lampotang, Fischler, Fishwick, and Lok 2008). Augmented reality offers similar opportunities to bridge between the abstract and concrete (Rogers 2004), and for learners to build deeper connections with the material by learning about otherwise hidden physical, historical, and cultural aspects of the outdoor space (Zimmerman and Land 2014, Kamarainen et al. 2015). For reasons of practicality and cost, this article focuses on AR experiences that are accessible through mobile devices (like smartphones and tablets), rather than those that require a head-mounted display. We present examples that use a combination of location-based and vision-based triggers. As described below, the affordances of mobile and location-based AR align most closely with learning goals relevant to ecology and environmental science. Ecologists and ecosystem scientists bring to bear sophisticated conceptual models and background knowledge when they observe or study natural systems (Eberbach and Crowley 2009; Kamarainen and Grotzer, in review). These perspectives provide scientists with a “search image” that can help them identify patterns, notice things that are unusual, pay attention to relevant signals, and connect their observations to prior understanding of natural history. This leads to the question: “How might AR support learners in seeing the world through an ecologist's eyes?” While AR platforms and experiences have only recently reached a level of maturity that makes broad use in learning contexts feasible, research using early versions and prototypes provides compelling arguments that well-designed AR experiences support learning. (A short review of this active area of research is provided below.) Augmented reality can support ecology learning through revealing hidden or invisible aspects of the system, and linking visualization of hidden processes with macro-scale or emergent outcomes. Dunleavy (2014) refers to this as using AR to “see the unseen” and highlights this as a general principle for designing impactful AR for learning. Prior research supports the idea that prompting students to notice and reflect upon processes that are responsible for patterns and change in natural systems can support shifts in student thinking from static or event-based notions of causality toward process-based explanations (Lindgren and Moshell 2011, Grotzer et al. 2013). There are a number of ways AR can be leveraged to support these outcomes. Augmented reality locations can be positioned at places where one wants students to observe an object, pattern, or phenomenon that they might not otherwise notice—for example, a nesting cavity created by a woodpecker, the layer of silt left behind after a spring flood, or a path cut through brush by a deer. As described by Eberbach and Crowley (2009), engaging in scientific observation is a more challenging skill than is generally appreciated, and requires the coordination of disciplinary knowledge, practices of observation, and the application of one's attention. Novices often do not know what to look for, so may quickly give up or overlook interesting artifacts. Augmented reality can be used to alert students to an opportunity to observe something meaningful that connects with ideas they have been learning; and tips, reminders, and reflective prompts embedded in the experience can encourage students to adopt practices of observation that mirror those used by experts (Klopfer and Squire 2007, Dunleavy and Dede 2014, Grotzer et al. 2015). Augmented reality can also be used to overlay or link multiple representations of a system or phenomenon (Zimmerman and Land 2014, Kamarainen et al. 2016). Many ecological changes are driven by organisms or processes that are too small to see, while emergent patterns or outcomes may be best visualized from a bird's-eye perspective. When making sense of scientific visualizations and abstractions, students benefit from viewing multiple forms of representation (Ainsworth 1999, Wu and Shah 2004), manipulating and interacting with physical models as well as visual representations, engaging in metacognition and reflection related to the visualizations (Chang et al. 2009, Wu and Shah 2004), and making links among representations (Wu and Shah 2004). Augmented reality can support this by juxtaposing multiple representations or overlaying them with a physical pattern or phenomenon, which allows the user to easily connect and compare the representations without having to hold one in mind while accessing a second (Tang et al. 2003, Pathomaree and Charoenseang 2005, Radu 2014). Augmented reality visualizations can be used to communicate changes over time by embedding views or narratives that describe the history of a place (Zimmerman and Land 2014, Grotzer 2015), or by pointing to evidence of change over time that may be present within the landscape. Ecosystems can have long “memories”—the abiotic conditions, species composition, and relationships present may depend on what happened at the site last season or long ago. Through repeated visits and careful observation, ecologists often develop deep knowledge of the history of a place, its rhythms, and its phenology. Using AR to provide visitors with time lapse, before-and-after, or “what if it were winter” views of a space that they may only visit once can provide a powerful shift in perspective. For example, residents near a community park that was the location of one of our augmented field trips had fortuitously installed a hidden video camera and collected footage of nocturnal and diurnal visitors to the park. We embedded a highlight reel from the camera in the AR experience and activated the video when students arrived at the tree in which the camera had been mounted. Students watched as the nighttime footage showed a fox passing through, a pair of raccoons climbing the tree, and a coyote urinating near the base of the tree to mark its territory. The next clip showed a daytime scene of a neighborhood dog sniffing the location and adding his own scent to the mix. The AR experience prompted students to consider that there are many organisms that frequent the park even though they may not be seen during the daytime field trip. These ways of making the invisible visible help students to see an environment through an ecologist's eyes. How complex is developing these kinds of learning experiences? The basic architecture of location-based or vision-based AR platforms includes two parts: (1) an online editor that the designer uses to upload media, link those media with GPS locations or visual QR codes, and orchestrate the sequence of events the user will engage with during the experience; and (2) a client-side application downloadable to a mobile device that allows the user to login and access the experience the designer has constructed. There are a number of location-based AR platforms that allow you to design your experience for free, and most also allow you to share the experience with unlimited users for free (e.g., ARIS, TaleBlazer, Aurasma), though some require a fee to use the experience with more than one user (e.g., FreshAiR) (see a list of experiences and platforms in Dunleavy and Dede 2014). A number of these platforms have recently been applied to undergraduate learning contexts. Klopfer and Squire (2008) summarize how a predecessor to TaleBlazer was used to create a mobile AR game called Environmental Detectives that supported undergraduates in an environmental science course. More recently, work by Clements (2017) outlines how the TaleBlazer AR platform was used to design a guided tour of a canyon for an undergraduate Physical Sciences course. and an of the use of to support learning by undergraduate This of work valuable for the design of location-based AR for location-based AR platform has and they share similar design that a designer to embed media and link those media with location-based or vision-based triggers. A designer can upload different forms of media audio, and the online editor and then embed these in the experience by linking them with particular locations or (e.g., a are of the of the experience (Fig. and can be used to locations and media on and so for example, the user at the and up a device they will see a location on their display. In its the experience may as a of on a virtual tour of the to a tour guided by a virtual the platform also allows a designer to use and to develop compelling and experiences that engage learners in and ways Reilly and Dede, the design is the experience be it accessible to the user through the application on their mobile device (see Fig. for an of the user A number of platforms provide experiences by a designer to or an experience from one location to This is a and way to it is important to consider the between the experience and the place that it will be AR experiences are in a way that is the virtual aspects of the experience to environmental (like or that the experience may be to a location without the of the learning In contrast, experiences are to be to specific and of the environment that may not be in locations (e.g., a on the Kamarainen et al. 2015). When using it is to consider and how the design of the experience with the of your location. an experience is way to a AR experience can allow you to observe how the experience appears in the design and user interfaces in order to the among the you have the you can to bring your own ideas the by and media, the adding or the order and sequence of user of the AR platforms TaleBlazer, FreshAiR) has online and to support These it to a or design and build your own AR experience from In an way to use AR in your undergraduate be to engage your students in the design of their own AR suggest that engaging students in design is a powerful way to support learning it students in by and prompts them to multiple and the design can learners a way of and that during the design opportunities for and 2007, et al. 2013). Prior work outlines of engaging as young as in AR design deeper and understanding of in their community (Klopfer and et al. et al. 2014, and 2014). the of AR and their of the design of AR experiences may be an way to engage undergraduate learners in deeper understanding of the environments may a in which students are to video, or that and how different on change over time over the of a a or a might embed media they have collected in an AR experience that can be with the of the or even the As students use and the AR experiences created by their they might understanding of diurnal patterns, or and more how ecological processes and in the that was a research by the which focused on the utility of AR experiences with virtual environments for ecosystem science Through a of we and a number of that focused on different physical and technological et al. 2013, Grotzer et al. Kamarainen et al. 2016). this of we used the platform and to a number of location-based AR experiences with associated that provide a of of experiences from to We have these and that will and them to best with the of their own the design we provide a of a of the experiences that are is to help students the of in with a on the of of and the processes of and that students to the of and to sense of processes that are not visible in which to in about and et al. and in understanding these processes undergraduate classrooms, by of the between and and these can be to et al. et al. the students to a or through their environment. The as of a and students the of through by or of the through and an is from a The as of a within a and after being by a virtual the material to the of the and is used by The is and makes its way the and this is up by a and in with the to being a of a in a different The AR provides ways to engage students in active and learning that have been shown to be in student learning (Price and Rogers 2004). Students the environment the of and Physical of the of students a more of material (Price and Rogers 2004, and Moshell Through engaging students in this and the design to give a sense that and though not are all as well as a physical sense of the for of in the environment. Students are prompted to look for and notice of the environment (e.g., look for notice and of and that connect with the of their virtual (Fig. A QR code to the of a tree an of by of and to and within a (Fig. Through of multiple and with real objects can and otherwise hidden processes and the through multiple that the in different students of may change to and a of different physical it is (Fig. The includes that the ideas that are through the augmented field In students might build their own of the or on their We used the by and to help students about the and of in the of and The provides a way of the of in an thinking about a and where it is over it might from one place to it might or in the location and in its or it may or be from one to through a physical, or The design of is that the experience can be used in place that has and some We to use representations of and which are compared to current understanding of these representations are for students to and the The experience students through a for and the number and of shown is this experience not represent the and patterns of and that characterize and is to reveal hidden processes in the and of in in on these learning it aspects related to the of while One potential might be to engage students in designing that be the that over different time or related to different (e.g., or engaging students in of the they be prompted to consider ways in which the experience the and of and in what ways it be to represent what is currently about material in powerful of this be to and through the processes that are in the current of This help in undergraduate learners related to understanding how is used and during basic ecological processes and et al. The is as a of a that students in a virtual world called et al. 2013). In the students a virtual a and information and evidence to build an about what happened to the The experience students to a real or to a is to in the real The AR leads students to visit a to up environmental might include or (Fig. in students the area and on the of a or (Fig. Through prompts embedded in the AR they share and compare their to those collected by their and evidence about that might their The of this experience is to support students in their own data, and a that the and of natural The location-based AR students to a location where they can up a that will allow them to of their environment. In to students the space and the AR experience and about how to use the This support can allow students to the at their own through the and and at location they We have that students an experience that is to a between and After having collected from multiple locations the students are to a where they their on and them to number and (Fig. As more students visit the patterns and students can see in the of or can notice spatial patterns in the on the multiple representations of the can help the students and connect the in with physical of the environment. In the of the we use the affordances of AR to help students engage in deeper interpretation of their have to visit of the locations that is interesting to them and can engage with each of the The experience a number of learning goals that are relevant to undergraduate Students engage in to scientific as the experience students to use environmental to their own These are then displayed in a visual representation that allows the and of from multiple and In our we used physical visual displays in the you might using a to student and it for can be leveraged to support students in and applying to sense of the spatial and patterns in these of what makes AR a powerful tool is that it allows the to support student learning of the of the and to do so in ways that connect with concepts or practices in the course. can use the AR to reflection and metacognition by the user to consider how what they are in the real world connects with what they are learning in These will be most if students are opportunities to engage in reflection and metacognition in during the course. way that these AR can be be to bring from the field experience the for or may include data, or We a combination of to the support about the behind and also supported students in conceptual models material or et al. Kamarainen et al. 2016). the supports and each experience is different and the of experiences can be leveraged in the to support learning that is and Augmented reality will be most powerful when it is the of in ways that and connect with knowledge and practices in The experiences were through an design that and the experiences on with students and et al. 2013, 2016). While the that are are to the and of these experiences be to an Many of the concepts by these (e.g., of understanding the making sense of are even for undergraduate students and align with learning goals in the and in for the of 2013). the learning outcomes that are in the using AR with your undergraduate students help you connect the concepts you are in with outdoor environments that students through Students often evidence that what they are learning about connects to their learning experiences in the places they through each can the you use AR to give your students so they can see in the tree near the to the they may of it each time they the tree on their way to The idea of your students in the design has the potential to support learning and engagement and Klopfer et al. The AR platforms for and students are to up with ways to engage and communicate with one of that a powerful or to the way the ideas are by the The AR platform offers ways to experiences that have been so that the easily review the students’ experience to on and Augmented reality makes it to design multiple field trips with different to as an not to the associated with a field or time of to students during the The of undergraduate students have their own so an AR experience at their own on their own even if the location is not on In ecologists and ecosystem scientists with them a of background knowledge when they natural systems and Grotzer, in and students only access to a small of this through with these and knowledge through the use of AR has the potential to support deeper understanding of concepts in the and more meaningful engagement with the environments research was supported by and by the was supported by the of of Sciences through and or in this material are those of the and do not reflect the views of the the of or

References

YearCitations

Page 1