Deb Verhoeven, Deakin University
Conal Tuohy and Richard Rothwell, VeRSI
Ingrid Mason, Intersect Australia
Richard Rothwell presenting. I've previously heard Ingrid Mason talk about HuNI at NDF2012.
Idea of a virtual laboratory as a container for data (from a variety of disciplines) and a number of tools. But many existing tools are like virtual laboratories themselves, and are often discipline-specific.
Have a 0.9 EFTS ontologist. Also a project manager, technical coordinator, web page designer, tools coordinator and software developer.
Defined the project as a linked open data project: humanities data goes into the HuNI triple store (as RDF), which is embedded in the HuNI virtual lab to provide the user interface. Embellishments include providing the linked open data via SPARQL, publishing via OAI-PMH, using AAF (Shibboleth) authentication, and using a SOLR search server for the virtual lab.
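Not something shown in the talk, but to make "provide linked open data in SPARQL" concrete, here is a minimal Python sketch of what programmatic access might look like once a public endpoint exists. The endpoint URL and the use of FOAF are my placeholders, not HuNI specifics.

    # Hypothetical sketch: query a HuNI-style SPARQL endpoint from Python.
    # The endpoint URL and the FOAF vocabulary are placeholders, not HuNI specifics.
    from SPARQLWrapper import SPARQLWrapper, JSON  # pip install sparqlwrapper

    sparql = SPARQLWrapper("http://example.org/huni/sparql")  # placeholder endpoint
    sparql.setQuery("""
        PREFIX foaf: <http://xmlns.com/foaf/0.1/>
        SELECT ?person ?name
        WHERE { ?person a foaf:Person ; foaf:name ?name . }
        LIMIT 10
    """)
    sparql.setReturnFormat(JSON)

    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["person"]["value"], row["name"]["value"])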
Have ideas of research use-cases (basic and advanced, e.g. SPARQL queries) and desired features, e.g. custom analysis tools. The challenge is to get internal bridging relationships between datasets as well as global interoperability. Aggregating alone doesn't solve siloisation.
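To illustrate (my example, not theirs) what a bridging relationship adds beyond plain aggregation: if the same person appears in two source collections, a single owl:sameAs triple lets one query traverse both. The URIs, the "bonza:directed" property and the data below are invented for illustration.

    # Hypothetical sketch of a bridging triple between two aggregated datasets,
    # built with rdflib. All URIs and the "bonza:directed" property are invented.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import FOAF, OWL

    ADB = Namespace("http://example.org/adb/")      # say, a biographical dictionary
    BONZA = Namespace("http://example.org/bonza/")  # say, a film/TV database

    g = Graph()
    g.add((ADB.person123, FOAF.name, Literal("Jane Campion")))
    g.add((BONZA.p456, BONZA.directed, Literal("The Piano")))
    g.add((ADB.person123, OWL.sameAs, BONZA.p456))  # the bridging relationship

    # Cross-dataset query: a name from one collection, a credit from the other.
    q = """
        SELECT ?name ?film WHERE {
            ?p foaf:name ?name ;
               owl:sameAs ?p2 .
            ?p2 bonza:directed ?film .
        }
    """
    for name, film in g.query(q, initNs={"foaf": FOAF, "owl": OWL, "bonza": BONZA}):
        print(name, film)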
"Technology-driven projects don't make for good client outcomes."
Q: What response from broader humanities community?
A: Did some user research, though not as much as they wanted. One impediment is that when building a database you tend to have more contact with the people creating collections than with the people using them. They're trying to build the framework/container first; the idea is that researchers will come to them and say "We want this tool" and they'll build it. Funding is set aside for further development.
Q: You compared this to Galaxy, but you've built from the ground up where Galaxy is more fluid. A person with command-line skills can create tools in Galaxy, but with HuNI you'd have to do it yourself.
A: Bioinformatics folk tend to be competent with Python, but we're not sure what competencies our researchers will have; they're less likely to be able to develop tools for themselves.
Requirements for a New Zealand Humanities eResearch Infrastructure
James Smithies, University of Canterbury
Vast amounts of cultural heritage being digitised or being born online. Humanities researchers will never be engineers but need to work through the issues.
International context:
Humanities computing has been around for decades but is still in its infancy. The US, UK, and even Australia have ongoing strategic conversations, which help build roadmaps. NZ is quite far behind these (though we have used punchcards where necessary). The "Digging into Data Challenge" runs overseas, but we're missing out because of lack of infrastructure and lack of awareness.
Fundamentals of humanities eResearch:
HuNI provides a good model. Need a shift from thinking of sources as objects to viewing them as data. Big paradigm shift. Not all research will work like this, but programmatic access will become more important.
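To make the objects-to-data shift concrete (my sketch, not the speaker's): once a source has a machine-readable transcription, a few lines of Python can start asking questions of it. The file name below is a placeholder.

    # Trivial "sources as data" example: treat a digitised transcription as data
    # to compute over rather than a document to read. File name is a placeholder.
    import re
    from collections import Counter

    with open("ships_log_1865_transcription.txt", encoding="utf-8") as f:
        text = f.read().lower()

    words = re.findall(r"[a-z']+", text)
    print(Counter(words).most_common(20))  # most frequent terms across the voyage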
National context:
19th-century ships' logs, medical records from leper colonies. Hard to read, incomplete, of questionable accuracy. We have traditional methods to deal with these, but the problems multiply when they're ported into digital formats. A big problem is lack of awareness of what opportunities exist, so capabilities and infrastructure are low. Decisions are often outsourced to the social sciences.
At the same time, DigitalNZ, the National Digital Heritage Archive, the Timeframes archive, AJHR, PapersPast, etc. are fantastic resources that could be leveraged if we come up with a central strategy.
Requirements:
- Need to develop training schemes
- Capability building. Lots of ideas out there but people don't know where to start. Need to look at peer review, PBRF - how to measure quality and reward it.
- International collaboration
- Requirements elicitation and definition
- Funding for all of the above including experimentation
Q: Data isn't just data, it's situated in a context. Being technology-led and using RDF is one thing. But how do we give richness to a collection?
A: A classic example would be a researcher wanting access to an object that's properly marked up, and wanting to contribute to the conversation by adding scholarly comments and engaging with others' marginalia. E.g. an ancient Greek text corpus (I think he's describing the Perseus Digital Library). Want both a simple interface and programmatic access.
Q: Need to make explicit the value of an NZ corpus. We have some pieces but need to join them up. Need to work with DigitalNZ. Once we have a corpus we can look at tools.
A: Yes, need to get key stakeholders around table and talk about what we need.
Capturing the Flux in Scientific Knowledge
Prashant Gupta & Mark Gahegan, The University of Auckland
Everything changes - whether the physical world itself or our understanding of the world:
* new observation or data
* new understanding
* societal drivers
How can we make our tools and systems more dynamic, so they can deal with this change?
Ontology evolution - lots of work has been done on this. Researchers have updated knowledge structures and recorded the updates as provenance or change logs. These tell us "knowledge that", e.g. what the change is, when it happened, who made it, to what, etc. But we still don't capture "knowledge how" or "knowledge why".
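As an aside (my illustration, not from the talk), a typical change-log entry captures exactly those "knowledge that" fields and nothing more; the field names and values are invented:

    # Illustrative change-log entry: records "knowledge that" only.
    change_entry = {
        "what":    "split category 'Wetland' into 'Bog' and 'Fen'",
        "when":    "2013-02-14",
        "who":     "j.smith",
        "to_what": "land-cover ontology v3.2",
        # missing: *how* the decision was made and *why* (new data? new theory?)
    }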
Life cycle of a category:
Processes, context and researchers' knowledge are involved in the birth of a category, but these tend to be lost once the category is formed. We're left with the category's intension, extension, and place in the conceptual hierarchy. Lots of information is not captured.
"We focus on products of science and ignore process of science".
Proposes connecting static categories with the process of science to get a better understanding. This could act as a fourth facet of a category's representation, help address the interoperability problem, and help track the evolution of categories.
Process model:
The process of science gives birth to conceptual change → conceptual change modifies scientific artifacts → artifacts are connected as linked science → linked science improves the process of science.
If change is not captured, the network of artifacts will become inconsistent and linked science will fail.
Proposes building a computational framework that captures and analyses changes, creating a category-versioning system.
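A rough sketch of my own reading of that proposal (nothing like this was shown in the talk): a version record that keeps the three static facets plus a fourth, process-oriented facet capturing how and why. All names and fields are illustrative.

    # Hypothetical category-versioning record: intension, extension, hierarchy,
    # plus a fourth facet recording the process behind the change.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class CategoryVersion:
        name: str
        intension: List[str]            # defining properties
        extension: List[str]            # known members / instances
        parents: List[str]              # place in the conceptual hierarchy
        derived_from: Optional[str]     # previous version, if any
        how: str = ""                   # method behind the change
        why: str = ""                   # rationale behind the change
        changed_by: str = ""
        changed_on: str = ""

    v1 = CategoryVersion("Wetland", ["saturated soil"], ["site-04", "site-11"],
                         ["LandCover"], derived_from=None)
    v2 = CategoryVersion("Bog", ["saturated soil", "peat-forming"], ["site-04"],
                         ["Wetland"], derived_from="Wetland@v1",
                         how="split using 2012 peat-depth survey",
                         why="single 'Wetland' class hid ecologically distinct sites",
                         changed_by="j.smith", changed_on="2013-02-14")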
Comment from James Smithies: would fit well in humanities context.
Comment: drawing a parallel with changeset management in software development.