Gephi forums

Community support

Visualizing Wikipedia - Memory Problem

Computer requirements and configuration on Windows, Mac OS X and Linux

Post by b0n12k » 01 Sep 2013 10:26

Hey All,

I am desperately trying to visualize the Wikipedia dataset, with articles as nodes and the links between them as edges (here is a preview: http://downloads.dbpedia.org/preview.ph ... _en.nt.bz2 and here is the full N-Triples dataset: http://downloads.dbpedia.org/3.8/en/pag ... _en.nt.bz2 ).
Since I couldn't find a way to load the .nt file directly into Gephi, I used graphipedia (https://github.com/mirkonasato/graphipedia) to build a Neo4j database from it, and then loaded that database into Gephi with the Neo4j plugin.
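For anyone who hits the same wall: one alternative to going through Neo4j is to stream the .nt file directly into an edge-list CSV that Gephi's spreadsheet importer can read, so nothing has to fit in memory during conversion. This is just a sketch I'd try, not the graphipedia code; it assumes the standard N-Triples line format `<subject> <predicate> <object> .` used by the DBpedia page-links dump:

```python
import re

# Matches one N-Triples line whose object is a URI:
# <subject> <predicate> <object> .
TRIPLE_RE = re.compile(r'<([^>]+)>\s+<[^>]+>\s+<([^>]+)>\s*\.')

def nt_to_edges(lines):
    """Yield (source, target) pairs from N-Triples lines,
    skipping blank lines and comments."""
    for line in lines:
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        m = TRIPLE_RE.match(line)
        if m:
            yield m.group(1), m.group(2)

def write_edge_csv(nt_path, csv_path):
    """Stream the .nt file into a Source,Target CSV for Gephi's
    'Import Spreadsheet' dialog, one line at a time."""
    with open(nt_path, encoding='utf-8') as src, \
         open(csv_path, 'w', encoding='utf-8') as dst:
        dst.write('Source,Target\n')
        for s, t in nt_to_edges(src):
            dst.write(f'"{s}","{t}"\n')
```

The conversion itself then runs in constant memory; whether Gephi can hold the resulting graph is a separate question, of course.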
I upgraded my computer to 16 GB of RAM on 64-bit Ubuntu, but Gephi always freezes after some time and doesn't load more than roughly 8 million nodes. I can see that Gephi uses all of my RAM, but I never get the "out of memory" error that I usually get when I restrict Gephi to less than 16 GB of RAM.
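For context, the limit I'm restricting is the JVM's maximum heap (-Xmx). In my install it is set in Gephi's etc/gephi.conf; the path and the other flags may differ by version, and the 14 GB value below is just an example that leaves some RAM for the OS:

```
# etc/gephi.conf -- raise the JVM heap cap (example value)
default_options="--branding gephi -J-Xms256m -J-Xmx14336m"
```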

Since that didn't work, I also tried my luck with the SemanticWebImport plugin (http://answers.semanticweb.com/question ... -wikipedia), without success.

It's really strange, since I don't think this should be such a big problem. What do you think I should do? Is there actually an elegant way to handle this, or do I just need even more memory?

Greetings,
b0n12k
Posts: 1
Joined: 01 Sep 2013 10:06
