
Overcome "GraphDef cannot be larger than 2GB" in TensorFlow [Resolved]

I am using TensorFlow's ImageNet-trained model to extract the last pooling layer's features as representation vectors for a new dataset of images.
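
For reference, the feature extraction itself looks roughly like the sketch below. The helper extract_features is hypothetical; the tensor names 'pool_3:0' and 'DecodeJpeg/contents:0' follow the classify_image example, and I'm assuming the Inception graph has already been imported into the session:

import numpy as np
import tensorflow as tf

def extract_features(sess, image_path):
  # Fetch the last pooling layer ('pool_3:0', shaped 1x1x1x2048) for the
  # raw JPEG bytes fed to the pretrained Inception graph that is assumed
  # to be loaded into sess.graph already.
  image_data = tf.gfile.FastGFile(image_path, 'rb').read()
  pool_tensor = sess.graph.get_tensor_by_name('pool_3:0')
  features = sess.run(pool_tensor, {'DecodeJpeg/contents:0': image_data})
  return np.squeeze(features)  # flatten to a 2048-element vector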

Out of the box, the model predicts on a single new image as follows:

python classify_image.py --image_file new_image.jpeg 

I edited the main function so that it takes a folder of images, returns predictions on all of them at once, and writes the feature vectors to a CSV file. Here is how I changed the main function:

import csv
import os

def main(_):
  #image = (FLAGS.image_file if FLAGS.image_file else
  #         os.path.join(FLAGS.model_dir, 'cropped_panda.jpg'))
  # Edit: take a directory of image files instead of a single file.
  if FLAGS.data_folder:
    images_folder = FLAGS.data_folder
    list_of_images = os.listdir(images_folder)
  else:
    raise ValueError("Please specify image folder")

  with open("feature_data.csv", "wb") as f:
    feature_writer = csv.writer(f, delimiter='|')

    for image in list_of_images:
      current_features = run_inference_on_image(images_folder + "/" + image)
      feature_writer.writerow(current_features)  # one '|'-separated row per image

It worked just fine for around 21 images, but then crashed with the following error:

  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/", line 1912, in as_graph_def
    raise ValueError("GraphDef cannot be larger than 2GB.")
ValueError: GraphDef cannot be larger than 2GB.

I thought that each call to run_inference_on_image(images_folder + "/" + image) would overwrite the previous image's data so that only the new image is considered, but that doesn't seem to be the case. How can I resolve this issue?
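
If it helps, here is a minimal sketch of what I now suspect is happening (the helper leaky_inference is hypothetical, standing in for run_inference_on_image): each call appends a fresh copy of the ops to the same default graph, so the serialized GraphDef grows on every iteration until it crosses the 2GB protobuf limit, which would also explain why it survived roughly 21 images before crashing:

import tensorflow as tf

def leaky_inference(image_path):
  # Stand-in for run_inference_on_image: each call adds new ops to the
  # *same* default graph instead of reusing the ones built last time.
  image_data = tf.read_file(image_path)
  with tf.Session() as sess:
    return sess.run(image_data)

# Every iteration grows the default graph, so serializing it keeps
# getting bigger until TensorFlow raises the 2GB error:
# for image in list_of_images:
#   leaky_inference(images_folder + "/" + image)
#   print(len(tf.get_default_graph().as_graph_def().node))  # keeps growing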

Question Credit: MedAli
Asked August 9, 2016
Posted Under: Programming
1 Answer
