VU AI Show & Share
My Practical AI Workflow for Education and Research
As a researcher and educator at the University of Amsterdam, I worked with RStudio for over 15 years. But after trying Positron, the next-generation IDE built by Posit, I haven’t looked back.
In my session at the VU AI Show & Share event, I tempted the demo gods and used live AI tools to generate my presentation on the spot. This isn’t about using AI for the sake of novelty; it’s about a fundamental shift in how we create content, supervise students, and conduct research.
Generating and Deploying Presentations
I made a choice back in 2015 to stop using PowerPoint. Instead, I moved to a workflow powered by Quarto and reveal.js, which lets me generate high-quality, HTML-based presentations written in Markdown.
Generating presentations.
In my current practice, I use Claude integrated directly into Positron. I provide the LLM with my session abstract, and it structures a complete slide deck for me in seconds. These slides are then deployed via GitHub Pages. It’s important to understand that pushing to GitHub isn’t just a backup; it is the publishing act. If a student identifies a typo mid-lecture, I can edit the source, push the change, and the update is live the moment they refresh their browser.
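As a sketch of what this looks like in practice, a reveal.js deck in Quarto is a single plain-text `.qmd` file (the file name and slide content here are invented for illustration):

```markdown
---
title: "Example Lecture"   # placeholder title
format: revealjs           # render the deck with reveal.js
---

## First slide

- Ordinary Markdown bullets become slide content

## Second slide

Equations, videos, and WebGL embeds can be mixed in as needed.
```

Running `quarto preview` gives an instant local browser preview, and `quarto publish gh-pages` pushes the rendered HTML to a `gh-pages` branch — which is what makes the git push itself the publishing act.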
While Quarto allows me to output to PDF or PowerPoint if needed, the HTML format offers advantages static files simply cannot match:
Deploying to GitHub.
- Live Interactivity: I can embed live YouTube videos or interactive WebGL images, such as a 3D neuron that I can rotate and manipulate during the talk.
- Dynamic Formulas: Complex mathematical notations are rendered natively and perfectly, ensuring accuracy across all devices.
- Instant Version Control: Every iteration of the lecture is tracked, providing a transparent history of how the material has evolved.
- Student Accessibility: Students don’t have to struggle with downloading massive .pptx files from a Virtual Learning Environment like Canvas; they simply scan a QR code or click a link to access a lightweight, mobile-friendly site.
- Universal Design for Learning: The HTML format allows for better accessibility features, such as screen reader compatibility and adjustable text sizes, making the content more inclusive for all students.
Open Educational Resource Textbooks
For our “Statistical Reasoning” course, we developed an Open Access book using this same framework. Because the LLM in Positron has context-aware access to my entire project folder, it acts as a true writing partner. It can suggest text that matches my specific tone, identify images within subdirectories, and automatically generate the correct Markdown references.
Writing support.
This transition from a “static resource” to a “living document” is best illustrated by a student engagement I’ll never forget:
“I had one student from Italy who forked my repository, corrected a typo, and submitted a pull request. This is the power of open access: the students themselves can help improve the resource for the next cohort.”
By hosting the source code on GitHub, the book becomes a collaborative effort. We even integrate interactive Shiny apps directly into chapters, allowing students to run simulations and pull real-time data samples as they read.
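As a hedged sketch, a Quarto book project like this is driven by a `_quarto.yml` file along these lines (the chapter file names and repository URL are placeholders):

```yaml
project:
  type: book                  # a Quarto book project

book:
  title: "Statistical Reasoning"
  chapters:
    - index.qmd               # hypothetical chapter files
    - 01-inference.qmd
  repo-url: https://github.com/example/stat-reasoning   # placeholder URL
  repo-actions: [edit, issue] # adds "edit this page" / "report an issue" links

format:
  html: default
```

The `repo-actions` links are what turn the fork, fix, and pull-request loop described above into a one-click affair for readers.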
Thesis Supervision
The “product-focused” model of supervision—waiting for a final PDF to land in your inbox—is increasingly fragile in the AI age. My philosophy has shifted toward monitoring the process. I’ve moved my supervision entirely to Microsoft Teams, eliminating the black hole of email threads.
Using teams for supervision.
Process Over Product
The key technical step most people miss is having the student “Add a shortcut to OneDrive” for their specific Teams folder. This ensures their local working files are constantly synced to the cloud environment I can access. This isn’t about being a “police officer”—it’s about being an active guide. By checking the Version History in Word, I can see the gradual development of their ideas. If a massive block of text appears instantly without a history of incremental edits, it’s a signal to have a conversation about their AI usage early, rather than discovering a problem at the final submission.
The Supervision Shift
| Traditional Supervision (Product-Focused) | AI-Age Supervision (Process-Focused) |
|---|---|
| Feedback on sporadic, static draft submissions. | Continuous visibility into the active writing process. |
| Communication siloed in fragmented email chains. | “Always-on” communication via Teams tagging and chat. |
| Reviewing only the final “output” or “result.” | Monitoring Version History to track the evolution of thought. |
| High risk of undetected, late-stage AI plagiarism. | Early intervention when “large chunks” of text appear suddenly. |
Research Workflows and the Rise of Databots
In my research, I am currently experimenting with the Databot add-on within Positron for what I call “language-based statistics”. You start a session and simply talk to the bot in natural language, asking it to load data, run analyses, and generate visualizations. The most “next-level” aspect of this workflow is the bot’s ability to self-correct. If the initial R code generates an error, the bot doesn’t just give up; it analyzes the R console output, diagnoses the mistake, and iterates on the code until it works. At the end it produces a full Markdown report with all the code and output, which you can then render to HTML, Word, or PDF.
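Databot’s internals aren’t public, but the run-and-repair pattern it performs can be sketched in a few lines of Python (the function and the error-to-fix mapping are invented for illustration):

```python
def run_with_repair(code, fixes, max_attempts=3):
    """Run a code string; on error, apply a candidate fix and retry.

    Mimics the run -> read error -> revise -> rerun loop described above.
    `fixes` maps an error-message fragment to a revised code string.
    """
    env = {}
    for _ in range(max_attempts):
        try:
            exec(code, env)            # run the current version of the code
            return env.get("result")
        except Exception as err:
            # look for a fix whose key appears in the error message
            revised = next((new for frag, new in fixes.items()
                            if frag in str(err)), None)
            if revised is None:
                raise                  # no known repair: give up
            code = revised             # retry with the revised code
    raise RuntimeError("could not repair code")

# First attempt references an undefined variable; the "fix" defines it.
broken = "result = x + 1"
repair = {"name 'x' is not defined": "x = 41\nresult = x + 1"}
print(run_with_repair(broken, repair))  # → 42
```

The real bot, of course, generates its revision with an LLM rather than a lookup table, but the control flow — execute, inspect the console error, revise, rerun — is the same.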
AI data analysis in action.
Steps to a Markdown Report via Databot:
- Load Data: Point the bot to a data file in your project folder; it identifies the format, imports it, and summarizes its structure.
- Propose Analysis: The bot suggests relevant next steps based on the data. You approve, reject, or refine the proposed analyses and request additional visualizations or models.
- Execute & Self-Correct: The bot runs the code, monitors the R console, and iterates automatically when errors occur—whether a missing package, a misnamed variable, or a type mismatch.
- Generate Report: Ask the bot to compile everything into a report; it returns the full source text and code needed to render to HTML, Word, or PDF.
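The compiled report is itself ordinary Quarto markdown with executable R chunks; a fragment might look like this (the data file and variable names are invented for illustration):

````markdown
---
title: "Analysis Report"
format: html        # could equally be docx or pdf
---

## Data

```{r}
library(readr)
scores <- read_csv("scores.csv")   # hypothetical data file
summary(scores)
```

## Model

```{r}
# hypothetical outcome and predictor
fit <- lm(score ~ group, data = scores)
summary(fit)
```
````

Running `quarto render report.qmd --to docx` (or `--to pdf`, `--to html`) re-executes the chunks and produces the final document.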
This shift means we could stop teaching “button-pushing” in SPSS. Instead, statistics becomes like a language course: students must learn to read the code and critically verify the output.
Critical Considerations: Security and Skillsets
Adopting these tools requires a professional approach to security and a clear understanding of the evolving academic skillset:
- The “Data Vault” vs. Local LLMs: Sensitive research data must never be sent to a public cloud service. If you aren’t using a secure, university-managed API, run a local LLM so that private data never leaves your own machine.
- Teacher Access & Verification: Educators can access professional-grade AI tools through GitHub Copilot (free for verified educators) or via a university-provided API key, which may offer additional institutional support and data governance.
- Critical Evaluation: AI doesn’t replace the need to understand the underlying science. It shifts the required skill from “calculating” to critical evaluation. You must know enough about the math to recognize when the AI’s “self-fixed” code has hallucinated a result.
Closing Thoughts
Integrating these tools has saved me an immense amount of time. While the learning curve exists, the move toward a transparent, version-controlled, and automated workflow is the way forward for modern academia. I invite you to begin your own transition to a more AI-integrated future.
Resources Mentioned
- Positron: The new, AI-integrated IDE for R and Python.
- Quarto: The open-source system for technical publishing (HTML, PDF, PowerPoint).
- GitHub Copilot for educators: AI pair programming, free for verified educators (verification may require proof of employment, such as a redacted paystub).
- Databot: A Positron add-on for language-based data analysis.