Feed

Including a good example of a @Transkribus to @nodegoat workflow 👇

First results of the #nodegoat project "The Influx of Musicians to the Slovene Lands in the Long 19th Century" by Maruša Zupančič of @ZrcSazu, presented today at the "Musical Networking in the long 19th Century" conference organised by @HIPZagreb

RT @phn_larhra: #D4H21 Session 6 - Uncertain Time and Space
Pim Van Bree, Geert Kessels (@LAB1100)
Chronology Statements for nodegoat: a te…

RT @ChrWachter: Modelling uncertain dates and places and visualising them with #GIS? Pim van Bree and Geert Kessels on an important prob…

Next week we’ll give a #nodegoat workshop during the online conference ‘Musical Networking in the long 19th Century’

Programme and info on live stream can be found here: http://hmd-music.org/

Join the #nodegoat Day at @unibern, organised by @kaspargubler, to learn how @DH_unibe runs their nodegoat Go service.

Plus presentations from @buercky, @photopraline, Giulia Iannuzzi, Daniel Jaquet, @nina_janz, @Stefanie_Mahrer, & @NunoCamarinhas.

Sign up 👉 https://histdata.hypotheses.org/2104

RT @SBvanderVeen: Happy to have been part of a very interesting 13th Contact Day #JewishStudies on the Low Countries of the Antwerp Institu…

RT @gerthuskens: Next week, I will be talking at 4th @NDH_Network Conference about the use of digital humanities (@nodegoat) and network an…

RT @enlightenedpm: Great workshop by @kaspargubler , Geert Kessels and Pim van Bree (of @LAB1100 ) on @nodegoat and #linkedopendata with @l…

RT @photopraline: @nodegoat really is a powerful tool for historical network research! Just learned how to correctly use the GNDs with…

RT @photopraline: Today: hands on workshop with @nodegoat
Love it! Step by step instructions and many insider infos about data modeling.

RT @elisarossberger: Second day of introduction to @nodegoat workshop series today, organized by @kaspargubler. What a convenient & smart a…

RT @HistEdGER: …I am focusing on the Archdiocese of #Köln. It is a prosopographical study, for which I am working with a #Nodegoat da…

RT @DrAGrabowski: Making database of Caesarius of Heisterbach's contacts based on VIII Libri miraculorum is surprisingly enjoyable (and Nod…

RT @hchesner: Abby is talking about using @nodegoat for mapping connections between people (in the case of the blog above, showing Henriett…

RT @angelalinghuang: Phew! Another intense @euroweb4 day, developing our "Digital Atlas of Textile Heritage" further 🥳, discussing geocodin…

Session 4: Learn how to ingest data from other data sources. These additional datasets can be related or unrelated to the data in your environment:

E.g.: query the @KBNLresearch SPARQL endpoint using #VIAF IDs to ingest publications of the people in your database.
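
For illustration only, a rough Python sketch of what such a lookup could look like outside nodegoat. The endpoint URL and the schema.org-style predicates are assumptions, not the documented @KBNLresearch vocabulary; in nodegoat itself this is configured as a Linked Data Resource rather than written as code.

```python
# Hypothetical sketch: fetch publications for a person identified by a VIAF ID
# from a SPARQL endpoint. Endpoint URL and predicates are assumptions.
import requests

ENDPOINT = "http://data.bibliotheken.nl/sparql"  # assumed KB endpoint

def publications_for_viaf(viaf_id: str) -> list[dict]:
    query = f"""
    PREFIX schema: <http://schema.org/>
    SELECT ?work ?title WHERE {{
      ?author schema:sameAs <http://viaf.org/viaf/{viaf_id}> .
      ?work schema:author ?author ;
            schema:name ?title .
    }} LIMIT 100
    """
    resp = requests.get(
        ENDPOINT,
        params={"query": query},
        headers={"Accept": "application/sparql-results+json"},
    )
    resp.raise_for_status()
    return [
        {"work": b["work"]["value"], "title": b["title"]["value"]}
        for b in resp.json()["results"]["bindings"]
    ]
```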

Session 3: Learn how to set up Linked Data Resources and how to configure Ingestion processes. We will ingest data that enriches the Objects in your #nodegoat environment.

E.g.: query @wikidata using @gndnet IDs and store the returned biographical data in your environment.
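
As an illustration of the kind of query involved, a minimal Python sketch that looks up biographical data on Wikidata by GND ID (Wikidata property P227). In nodegoat the equivalent mapping is configured through a Linked Data Resource and an Ingestion process, not in code.

```python
# Minimal sketch: query Wikidata for biographical data by GND ID (P227).
import requests

WIKIDATA = "https://query.wikidata.org/sparql"

def biography_by_gnd(gnd_id: str) -> list[dict]:
    query = f"""
    SELECT ?person ?personLabel ?birth ?death ?birthPlaceLabel WHERE {{
      ?person wdt:P227 "{gnd_id}" .                # P227 = GND ID
      OPTIONAL {{ ?person wdt:P569 ?birth . }}     # date of birth
      OPTIONAL {{ ?person wdt:P570 ?death . }}     # date of death
      OPTIONAL {{ ?person wdt:P19 ?birthPlace . }} # place of birth
      SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en". }}
    }}
    """
    resp = requests.get(
        WIKIDATA,
        params={"query": query, "format": "json"},
        headers={"User-Agent": "nodegoat-workshop-sketch/0.1"},
    )
    resp.raise_for_status()
    rows = resp.json()["results"]["bindings"]
    return [{k: v["value"] for k, v in row.items()} for row in rows]
```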

Session 2: In this session we will import a CSV file that contains a subset of the medieval scholars that are part of @RAG_online.

We will also import spatial and temporal data to explore spatial and temporal filters and visualisations.
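
A hypothetical Python sketch of what such a CSV could contain; the column names (name, matriculation_date, place, lat, lng) are invented for illustration and are not the actual @RAG_online export format. In the workshop the mapping from CSV columns to Object Descriptions and Sub-Objects is done through nodegoat's import module.

```python
# Hypothetical sketch: read a CSV with one row per scholar, including a
# temporal statement (matriculation_date) and a spatial statement (lat/lng).
# Column names are assumptions for illustration only.
import csv

def read_scholars(path: str) -> list[dict]:
    with open(path, newline="", encoding="utf-8") as f:
        return [
            {
                "name": row["name"],
                "date": row["matriculation_date"],
                "place": row["place"],
                "location": (float(row["lat"]), float(row["lng"])),
            }
            for row in csv.DictReader(f)
        ]
```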

Session 1: Learn how to implement a data model in your own nodegoat environment. Once the data model is up and running, we will manually enter a small amount of data to evaluate your data model.
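
To give a feel for the shape of such a model, here is a rough sketch expressed as Python dataclasses. This is purely illustrative: in nodegoat the model is defined in the Model section of the interface (Object Types with Descriptions and Sub-Objects), not in code, and all field names below are assumptions.

```python
# Illustrative sketch of a simple person-centred data model:
# an Object ("Person") with dated, located Sub-Objects ("events").
from dataclasses import dataclass, field

@dataclass
class SubObject:
    label: str                                    # e.g. "Birth", "Enrolment"
    date: str                                     # e.g. "1843-05-01"
    location: tuple[float, float] | None = None   # (lat, lng)

@dataclass
class Person:
    name: str                                     # Object Description
    gnd_id: str | None = None                     # external identifier
    events: list[SubObject] = field(default_factory=list)  # Sub-Objects

example = Person(name="Example Musician")
example.events.append(SubObject(label="Birth", date="1820-03-12",
                                location=(46.05, 14.51)))
```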

Request your #nodegoat account here: https://nodegoat.net/requestaccount

Virtual #nodegoat Workshop series organised by the @snsf_ch SPARK project "Dynamic Data Ingestion"
• 28-04-2021 Data Modelling
• 05-05-2021 Importing Data
• 12-05-2021 Ingesting Biographical Data
• 26-05-2021 Ingesting Related Data

Hosted by @kaspargubler of @unibern