eROBSON learning scenario design template

This learning scenario template is to be used in an interactive workshop with teachers who are familiar with Educational Robotics (ER) and have limited or no background in Augmented Reality (AR). In the workshop, design support is provided by AR and/or ER experts.

Participating teachers receive a short introduction to the eROBSON approach of supporting ER with AR, a demonstration of two eROBSON tools (ARTutor and MirageXR), and hands-on experience with both tools. eROBSON experts support the teachers in designing initial scenarios for their own educational contexts.

Such scenarios can be designed to a) substitute current ER teaching scenarios (i.e., reach the same instructional goals as in live/current teaching without modifications or improvements); b) introduce modifications (slight functional improvements); or c) substantially modify and improve current scenarios of teaching ER (based on the SAMR model, Puentedura, 2010).

Principles of the Successive Approximation Model (SAM) and Design Thinking (DT) form the backbone of the instructional design methodology (see the Attachment for details on these approaches, an introduction to ER and AR, and some examples).

Cite as:

Giel van Lankveld, Olga Firssova, Nardie Fanchamps, Dimitris Karampatzakis, Gregory Milopoulos, Iliana Ntovolou, Gene Bertrand, Krist Shingjergji, Roland Klemke, Corrie Urlings, and Mikhail Fominykh: eROBSON learning scenario design template (2024). eROBSON Consortium. https://e-robson.eu/

eROBSON design methodology

This document presents an original methodology for teaching Educational Robotics at schools in online and blended formats using Augmented Reality. The eROBSON methodology supports the design of Augmented Reality-enriched use of Educational Robotics in STEM contexts. In a systematic design process, the Successive Approximation Model (SAM) is combined with the Design Thinking approach. The SAM model is a simplified version of the well-known ADDIE instructional design methodology. The teacher is the problem owner; experts in Educational Robotics and Augmented Reality provide support and guidance throughout the preparation, design and development phases. Teacher training is a live activity with a possible online component.

Cite as:

Giel van Lankveld, Olga Firssova, Nardie Fanchamps, Dimitris Karampatzakis, Gregory Milopoulos, Iliana Ntovolou, Gene Bertrand, Krist Shingjergji, Roland Klemke, Corrie Urlings, and Mikhail Fominykh: eROBSON design methodology (2024). eROBSON Consortium. https://e-robson.eu/

eROBSON co-design canvas

The eROBSON co-design canvas is a poster-size document that facilitates the collaborative process of eROBSON learning scenario design. It can be used together with the learning scenario design template.

Cite as:

Giel van Lankveld, Olga Firssova, Nardie Fanchamps, Dimitris Karampatzakis, Gregory Milopoulos, Iliana Ntovolou, Gene Bertrand, Krist Shingjergji, Roland Klemke, Corrie Urlings, and Mikhail Fominykh: eROBSON co-design canvas (2024). eROBSON Consortium. https://e-robson.eu/

eROBSON affordance cards

To facilitate the design phase of the proposed methodology, we developed a set of cards. These cards are meant for teachers and instructional designers who might have expertise in either ER or AR, but not necessarily in both. Each card represents a different AR affordance in the context of ER. A designer of an AR-ER activity can use the cards to learn about the possibilities of an AR-ER system and to connect a scenario design to the technological affordances.

The cards illustrate multiple properties of AR in an AR-ER system.

First, they illustrate the types of displayed content and user interface. An AR app can show 3D models of ER components and other media content in the AR space. It can also display user interface elements and other media content in the screen space (not in the AR space). In addition, an AR app can visualize components of a simulated environment, such as noise level or ambient temperature.

Second, the cards illustrate different interaction types and modes. Interaction is depicted on the cards with hand elements. The user can interact with AR content via the user interface of the AR device, or with physical elements such as an image marker or a physical ER object. An AR app can detect a physical interaction (e.g., between a physical ER component and the physical environment) and support a simulated interaction (e.g., between a sensor-type ER component and a simulated environment state).

Third, the cards distinguish between the physical and the AR-visualized elements of the AR-ER experience. Cards that illustrate how an AR app can work with physical ER components are colored blue, while those that show how an AR app can work with AR-simulated ER components are colored orange.

Finally, the cards illustrate different recognition types. Both marker-based and marker-less AR are illustrated with a marker element under an ER component. Object recognition is used when physical ER components are to be augmented.
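The sketch below is a hypothetical illustration, not part of the eROBSON deliverables, of how this four-dimensional card taxonomy (content type, interaction type, physical vs. simulated component, recognition type) could be encoded as a simple data structure, for example in a companion tool or card catalogue. All class, enum, and field names are invented for illustration.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ContentType(Enum):
    """What the AR app displays (labels are illustrative)."""
    MODEL_3D_IN_AR_SPACE = auto()   # 3D models of ER components in the AR space
    MEDIA_IN_AR_SPACE = auto()      # other media content anchored in the AR space
    UI_IN_SCREEN_SPACE = auto()     # interface elements / media in screen space
    SIMULATED_ENVIRONMENT = auto()  # e.g., noise level, ambient temperature


class InteractionType(Enum):
    """How the user or the system interacts (labels are illustrative)."""
    UI_INTERACTION = auto()         # via the user interface of the AR device
    PHYSICAL_INTERACTION = auto()   # with an image marker or a physical ER object
    DETECTED_PHYSICAL = auto()      # app detects a physical component-environment interaction
    SIMULATED_INTERACTION = auto()  # sensor-type component reacts to a simulated state


class ComponentKind(Enum):
    PHYSICAL = "blue"               # cards for physical ER components are blue
    AR_SIMULATED = "orange"         # cards for AR-simulated ER components are orange


class RecognitionType(Enum):
    MARKER_BASED = auto()
    MARKERLESS = auto()
    OBJECT_RECOGNITION = auto()     # used when physical ER components are augmented


@dataclass
class AffordanceCard:
    """One affordance card, combining the four property dimensions."""
    name: str
    content: ContentType
    interaction: InteractionType
    component: ComponentKind
    recognition: RecognitionType


# Hypothetical example: a card for a simulated sensor reacting to a simulated environment state
example_card = AffordanceCard(
    name="Simulated temperature sensor",
    content=ContentType.SIMULATED_ENVIRONMENT,
    interaction=InteractionType.SIMULATED_INTERACTION,
    component=ComponentKind.AR_SIMULATED,
    recognition=RecognitionType.MARKERLESS,
)
```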

Cite as:

Mikhail Fominykh, Chiara Santandrea, and Eleonora Nava: eROBSON affordance cards (2024). eROBSON Consortium. https://e-robson.eu/