[redland-dev] RAM footprint

Mikhail Levin mlevin at svarnetics.org
Thu Mar 28 15:44:24 EDT 2013


Dear all,

I am running out of memory trying to load a large ontology into
Redland's hashed in-memory store.  The ontology I am trying to load is
OpenGalen:
http://www.opengalen.org/sources/sources.html
In principle, my box (Ubuntu 64-bit) has enough RAM (12GB) to load
OpenGalen (9M triples).  Initially, RAM consumption grows linearly
with the number of triples loaded.  After loading ~2M triples,
however, I see a burst of allocation that runs me out of memory.
Loading the files in a different order produces the same result.

My code is below.  Am I doing something wrong?  Is there a way to
reduce the RAM footprint?
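
For reference, one way to watch the per-file footprint (a sketch, not
what I currently log; on Linux, ru_maxrss is the peak resident set
size of the process in kilobytes):

   #include <sys/resource.h>

   // Peak resident set size of this process so far, in KB (Linux).
   long peak_rss_kb() {
      struct rusage ru;
      getrusage(RUSAGE_SELF, &ru);
      return ru.ru_maxrss;
   }

Calling this after each file in the loop below would show exactly
where the burst begins.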

Thank you for your help

Mikhail


#include <redland.h>

#include <cstddef>
#include <string>
#include <vector>

// Builds the list of OpenGalen RDF/XML files under the given
// directory (defined elsewhere in my code).
std::vector<std::string> open_galen_file_list(const char* dir);

int main(int argc, char* argv[]) {
   librdf_world* world = librdf_new_world();
   librdf_world_open(world);

   // Hashed store with in-memory hash tables.
   librdf_storage* storage = librdf_new_storage(world, "hashes", NULL,
                                                "hash-type='memory'");
   librdf_model* model = librdf_new_model(world, storage, NULL);
   librdf_parser* parser = librdf_new_parser(world, "rdfxml", NULL, NULL);

   // The list of files to load.
   std::vector<std::string> fnv = open_galen_file_list(argv[1]);

   for (std::size_t i = 0; i != fnv.size(); ++i) {
      // std::cout << librdf_storage_size(storage) << '\t' << fnv[i] << std::endl;
      librdf_uri* uri = librdf_new_uri_from_filename(world, fnv[i].c_str());
      librdf_parser_parse_into_model(parser, uri, NULL, model);
      librdf_free_uri(uri);
   }

   librdf_free_parser(parser);
   librdf_free_model(model);
   librdf_free_storage(storage);
   librdf_free_world(world);
   return 0;
}
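
For completeness, this is the disk-backed variant of the same store
that I am considering as a workaround (a sketch; per the Redland
storage documentation, hash-type='bdb' keeps the hash indexes in
Berkeley DB files, and the name "galen" and dir='.' below are
placeholders):

   // Same "hashes" store, but backed by Berkeley DB files on disk.
   // new='yes' creates the files; "galen" and dir='.' are placeholders.
   librdf_storage* storage = librdf_new_storage(world, "hashes", "galen",
                                       "new='yes',hash-type='bdb',dir='.'");

This trades RAM for disk I/O, so I do not know yet whether it is fast
enough for 9M triples.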

