I have now managed (using ARC2 and SMWWriter, in a MediaWiki extension) to populate Semantic MediaWiki pages with triples from a snippet of RDF/XML (thanks to Egon Willighagen for the RDF-ized NMRShiftDB data, submitted further below), yay! But ... as you can see, using the full URIs as wiki names is not a good idea. URIs as wiki titles are ugly, and the predicates in this case even got truncated and were all treated as the same one, since they contained hashes, which MediaWiki doesn't allow in titles. That is the whole background for our talking so much about the "equivalent URI handler" (mentioned here, for example), which is meant to be a configurable handler of mappings from URI patterns to (sensible) wiki titles. Optimally, the same pattern can then be used both on import and export, so that the format is kept, allowing SMW to be used as a (collaborative) RDF editor (which is one of the main motivations for my GSoC project).
Well, the hard bit (URI -> wiki title mapping) remains. Diving into it now ...
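The kind of mapping I have in mind can be sketched roughly like this (in Python just for illustration; the real thing will be PHP inside the extension, and the namespace prefixes and wiki namespaces below are made-up examples, not actual SMW configuration):

```python
# Each rule maps a URI prefix to a wiki namespace prefix ("" = main namespace).
# These rules are hypothetical examples.
RULES = [
    ("http://example.org/onto#", "Property:"),
    ("http://example.org/mol/", ""),
]

def uri_to_title(uri):
    """Map a URI to a (sensible) wiki title, if a rule matches."""
    for uri_prefix, wiki_prefix in RULES:
        if uri.startswith(uri_prefix):
            local = uri[len(uri_prefix):]
            # '#' is not allowed in MediaWiki titles, so it must not survive
            return wiki_prefix + local.replace("#", "/")
    return uri  # no rule matched; the caller has to deal with the raw URI

def title_to_uri(title):
    """Inverse mapping, so the same rules work on both import and export."""
    for uri_prefix, wiki_prefix in RULES:
        if wiki_prefix and title.startswith(wiki_prefix):
            return uri_prefix + title[len(wiki_prefix):]
    for uri_prefix, wiki_prefix in RULES:
        if wiki_prefix == "":  # fall back to the main-namespace rule
            return uri_prefix + title
    return title

print(uri_to_title("http://example.org/onto#boilingPoint"))  # Property:boilingPoint
print(title_to_uri("Property:boilingPoint"))  # http://example.org/onto#boilingPoint
```

The point of having the inverse function is exactly the round-trip property mentioned above: exporting a page that was imported should give back the original URIs.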
This is my current mental picture of the architecture of the parts included in the RDF import/export functionality I'm implementing for Semantic MediaWiki as part of my Google Summer of Code project. I just got the ARC2-based store functional. The functionality still to be implemented is drawn with dashed lines:
 - - - - - - -      - - - - - - -
|   Export    |    |   Import    |
 - - - - - - -      - - - - - - -
       ^                  |
       |                  v
 - - - - - - - - - -     ---------------
| Equiv URI handler |-->|  SMW Writer   |
 - - - - - - - - - -     ---------------
       ^                        |
       |                        v
 ---------------------     ---------------
|  SPARQL+ Interface  |   |      SMW      |
 ---------------------     ---------------
       ^            ____________/   |
       |           /                |
       v          v                 v
 -----------------      ----------------
|   ARC2 Store    |    |  MediaWiki DB  |
 -----------------      ----------------
Now I have a working RDF store connector for Semantic MediaWiki that uses ARC2's RDF store, rather than SMW's built-in store. This will make it possible to take advantage of functionality in ARC2, such as the possibility to set up a SPARQL endpoint, etc.
The ARC2 connector implements the same subset of the SMWStore API as the JosekiStore, but I'm not yet sure whether more needs to be implemented for the things we want to do (general RDF import/export). I'll have to figure that out.
Feel free to try it out, but be warned that it has only been very briefly tested so far!
Back on track with GSoC. Follow my progress on my Twitter.
I presented my MSc thesis project "SWI-Prolog as a Semantic Web tool for semantic querying in Bioclipse" today. (Report available for download here.)
Find the slides below. I expected a very non-informatics audience (though some of my fellow Bioclipsers showed up =) ), had only 20 minutes, and lots of non-common-knowledge things to introduce, so these are really mostly a bunch of pictures for talking through the basics of the semantic web, Prolog, and Bioclipse.
Turning back to the GSoC project now!
I started actual coding for GSoC this Wednesday (the start was delayed two weeks because of exams, which I'll catch up on). Still just getting up to speed, but I'm now looking into the PHP RDF framework ARC, whose RDF store will replace the currently used RAP store in Semantic MediaWiki. Usage of ARC itself looks very straightforward; I just have to figure out the SMW Store API. Looking at SMWRapStore2.php now, to get an idea.
If you want to follow my progress in (approximate) real time, see my Twitter.
My degree project, titled "SWI-Prolog as a Semantic Web tool for semantic querying in Bioclipse", is getting closer to finished. My report has now been approved by the Scientific Reviewer (thanks, Prof. Mats Gustafsson), so I wanted to make it available here (Download PDF). Reports of typos are of course welcome! :)
Coding for my GSoC project will start for real around June 9th, but I just had a first look at the code, to start wrapping my head around the things involved. I installed the following on my local SMW:
Some questions that arose (for Denny in the first place, I guess, but feel free to comment):
With some copy and paste of code from this page, I quickly had a MediaWiki Special Page set up, where I could use SMWWriter's internal API to implement a crude form for adding or removing "triples" in my Semantic MediaWiki. See the screenshot:
And the result, on the Methane page:
Looks promising. Connecting this with some ARC functionality for parsing SPARQL and RDF/XML should be a big step in the right direction.
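To illustrate the import step I'm after (sketched in Python just for clarity; the real code will use ARC's PHP parser, and the namespace and data below are made-up examples): turn an RDF/XML snippet into (subject, predicate, object) triples, the shape that would then be fed to SMWWriter.

```python
import xml.etree.ElementTree as ET

RDF = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}"

def parse_triples(rdfxml):
    """Naive RDF/XML-to-triples sketch: handles rdf:Description elements
    with literal or rdf:resource property values, nothing more."""
    triples = []
    root = ET.fromstring(rdfxml)
    for desc in root.findall(RDF + "Description"):
        subj = desc.get(RDF + "about")
        for prop in desc:
            # ElementTree gives tags as '{namespace}local';
            # the predicate URI is namespace + local name
            ns, local = prop.tag[1:].split("}", 1)
            obj = prop.get(RDF + "resource") or (prop.text or "").strip()
            triples.append((subj, ns + local, obj))
    return triples

doc = """<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                  xmlns:ex="http://example.org/onto#">
  <rdf:Description rdf:about="http://example.org/mol/methane">
    <ex:formula>CH4</ex:formula>
  </rdf:Description>
</rdf:RDF>"""

print(parse_triples(doc))
# [('http://example.org/mol/methane', 'http://example.org/onto#formula', 'CH4')]
```

Each resulting triple would then go through the URI-to-title mapping before SMWWriter writes it to a page, e.g. onto the Methane page shown above.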