Running Google's im2txt (Show and Tell, Inception v3) on TensorFlow


My setup: Ubuntu 14.04 with a GPU

TensorFlow 1.0.1
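If you want to confirm your environment matches before starting, a quick version check (just a convenience one-liner, not part of the original setup; on this setup it should print 1.0.1):

$ python -c "import tensorflow as tf; print(tf.__version__)"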


Related paper: "Show and Tell: Lessons Learned from the 2015 MSCOCO Image Captioning Challenge"

https://arxiv.org/abs/1609.06647

The code was open-sourced last September.

GitHub: https://github.com/tensorflow/models/tree/master/im2txt#generating-captions


Following the README on GitHub.

First, install the required dependencies.

Install Bazel following the official instructions:
$echo "Deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.8" | sudo tee/etc/apt/sources.list.d/bazel.list $curl https://bazel.build/bazel-release.pub.gpg | sudo apt-key add-$sudo apt-get update && sudo apt-get install Bazel error: Some software packages cannot be installed. If you are using the unstable release, this may be due to the system not being able to meet the status you requested. Some of the packages you need may not have been created or have been removed from the new to (Incoming) directory in this release. The following information may be helpful in resolving a problem:   The following packages have an unsatisfied dependency:  bazel: Dependent: GOOGLE-JDK But unable to install it or                   JAVA8-JDK but cannot install it or                   JAVA8-SDK but cannot install it or                   Oracle-java8-installer but cannot install it E: cannot fix error, Because you require certain packages to remain current, they undermine the dependencies between packages.   tried the countless methods on the Internet, a variety of sources are useless, until I see a line of the official website: If you want the JDK 7, please replace  jdk1.8  with  jdk1.7 &n Bsp;and If you want to install theTesting version of Bazel, replace  stable  with  testing.   should be because my system is ubuntu14.04, so with the JDK7 $ update-java-alternatives-l #java-1.7. 0-openjdk-amd64 1071/usr/lib/jvm/java-1.7. 0-openjdk-amd64
Continuing with the official instructions, now using jdk1.7:

$ echo "deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.7" | sudo tee /etc/apt/sources.list.d/bazel.list
$ curl https://bazel.build/bazel-release.pub.gpg | sudo apt-key add -
$ sudo apt-get update && sudo apt-get install bazel
$ sudo apt-get upgrade bazel
Check that it is installed:

$ /usr/bin/bazel version

NumPy installation, per the official documentation at https://www.scipy.org/install.html:

$ python -m pip install --upgrade pip
$ pip install --user numpy scipy matplotlib ipython jupyter pandas sympy nose

Test it:

$ python
>>> import scipy
>>> import numpy
>>> scipy.test()
>>> numpy.test()

Some online guides say you can also install these via apt; I am not sure how that differs from the link in the GitHub README:

$ sudo apt-get install python-scipy
$ sudo apt-get install python-numpy
$ sudo apt-get install python-matplotlib
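Whichever install route you take, a quicker sanity check than running the full test suites is just to import each package and print its version (the version numbers you see will depend on your environment):

$ python -c "import numpy; print(numpy.__version__)"
$ python -c "import scipy; print(scipy.__version__)"
$ python -c "import matplotlib; print(matplotlib.__version__)"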
Natural Language Toolkit (NLTK): first install NLTK itself, following http://www.nltk.org/install.html:

$ sudo pip install -U nltk
$ sudo pip install -U numpy
$ python
>>> import nltk
Then install the NLTK data, following http://www.nltk.org/data.html, setting the download directory to /usr/share/nltk_data:

$ sudo python
>>> import nltk
>>> nltk.download()
Test that the data is installed:

>>> from nltk.corpus import brown
>>> brown.words()
['The', 'Fulton', 'County', 'Grand', 'Jury', 'said', ...]
I also ended up running the following at the end:

>>> import nltk
>>> nltk.download('punkt')

because during download and preprocessing I hit this error:

LookupError:
**********************************************************************
  Resource u'tokenizers/punkt/english.pickle' not found.  Please
  use the NLTK Downloader to obtain the resource:  >>> nltk.download()
  Searched in:
    - '/home/ubuntu/nltk_data'
    - '/usr/share/nltk_data'
    - '/usr/local/share/nltk_data'
    - '/usr/lib/nltk_data'
    - '/usr/local/lib/nltk_data'
    - u''
**********************************************************************
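If you prefer to fetch the missing resource non-interactively, and into the shared directory that appears in the search list above, the NLTK downloader can also be driven from the command line. This is a sketch assuming /usr/share/nltk_data is the directory you want and that you have sudo rights to write there:

# Download just the punkt tokenizer data into the system-wide NLTK data directory.
$ sudo python -m nltk.downloader -d /usr/share/nltk_data punkt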
Preprocessing

# Location to save the MSCOCO data.
MSCOCO_DIR="${HOME}/im2txt/data/mscoco"

# Build the preprocessing script.
bazel build im2txt/download_and_preprocess_mscoco

# Run the preprocessing script.
bazel-bin/im2txt/download_and_preprocess_mscoco "${MSCOCO_DIR}"
This step is simple in itself, but with a poor network connection the download stalls constantly. Mine dropped several times, and each attempt takes a very long time; in short, it is painful.
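One workaround when the script keeps dying mid-download is to fetch the MSCOCO archives yourself with a resumable downloader and then let the script unpack them. Whether the script reuses an already-downloaded archive depends on the version of download_and_preprocess_mscoco.sh, so check it (and, if necessary, comment out its own download lines); the URL and destination below are placeholders, not the script's real values:

# Resumable download of one MSCOCO archive; copy the real URL and target
# directory from download_and_preprocess_mscoco.sh before running this.
$ wget -c "http://example.com/train2014.zip" -P "${MSCOCO_DIR}/raw-data"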

Once the script prints its final completion output, the preprocessing has succeeded.
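A simple way to confirm the result is to count the sharded TFRecord files. The 256 training shards match the pattern the training command below expects; the validation and test shard counts are my recollection of the README, so verify them against your own output:

$ ls ${MSCOCO_DIR}/train-?????-of-00256 | wc -l   # expect 256
$ ls ${MSCOCO_DIR}/val-?????-of-00004 | wc -l     # expect 4
$ ls ${MSCOCO_DIR}/test-?????-of-00008 | wc -l    # expect 8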
Training
$ MSCOCO_DIR="/path/to/mscoco"
$ INCEPTION_CHECKPOINT="/path/to/inception_v3.ckpt"
$ MODEL_DIR="/path/to/models/im2txt/model"
$ bazel build -c opt im2txt/...
$ bazel-bin/im2txt/train \
>   --input_file_pattern="${MSCOCO_DIR}/train-?????-of-00256" \
>   --inception_checkpoint_file="${INCEPTION_CHECKPOINT}" \
>   --train_dir="${MODEL_DIR}/train" \
>   --train_inception=false \
>   --number_of_steps=1000000
Error: AttributeError: 'module' object has no attribute '_base'
Workaround: $ pip install --upgrade html5lib==1.0b8
After applying the fix, training starts normally:

Then it is just a matter of waiting. Reports online say training takes about a week; I trained for three or four days.
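While it runs, you can keep an eye on the loss curves with TensorBoard. This assumes TensorBoard was installed alongside TensorFlow and simply points it at the --train_dir used above:

$ tensorboard --logdir="${MODEL_DIR}/train"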
The next step would be fine-tuning, but before that, test the effect of the training so far.
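For reference, the fine-tuning step in the im2txt README continues training with the Inception weights unfrozen. Roughly, it looks like the following; the flag values are my reading of the README, so double-check them there before running:

$ bazel-bin/im2txt/train \
>   --input_file_pattern="${MSCOCO_DIR}/train-?????-of-00256" \
>   --train_dir="${MODEL_DIR}/train" \
>   --train_inception=true \
>   --number_of_steps=3000000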
If you want to save time, you can also skip the training steps and use the model I trained directly.

Using an existing model directly (the model I trained):
https://github.com/withyou1771/im2txt

$ CHECKPOINT_PATH="/path/to/model.ckpt-1000000"
$ VOCAB_FILE="/path/to/word_counts.txt"
$ IMAGE_FILE="/path/to/models/im2txt/1.jpg"
$ bazel build -c opt im2txt/run_inference
$ bazel-bin/im2txt/run_inference \
    --checkpoint_path=${CHECKPOINT_PATH} \
    --vocab_file=${VOCAB_FILE} \
    --input_files=${IMAGE_FILE}
$ CHECKPOINT_PATH="/path/to/model.ckpt-1000000"
$ IMAGE_FILE="/path/to/1.jpg"
$ VOCAB_FILE="/path/to/word_counts.txt"
$ bazel build -c opt im2txt/run_inference
INFO: Found 1 target...
Target //im2txt:run_inference up-to-date:
  bazel-bin/im2txt/run_inference
INFO: Elapsed time: 0.138s, Critical Path: 0.00s
(tensorflow) ubuntu@ubuntu-all-series:/home/data1/tf/models/im2txt$ bazel-bin/im2txt/run_inference --checkpoint_path=${CHECKPOINT_PATH} --vocab_file=${VOCAB_FILE} --input_files=${IMAGE_FILE}
I tensorflow/stream_executor/dso_loader.cc:135] successfully opened CUDA library libcublas.so.8.0 locally
I tensorflow/stream_executor/dso_loader.cc:135] successfully opened CUDA library libcudnn.so.5 locally
I tensorflow/stream_executor/dso_loader.cc:135] successfully opened CUDA library libcufft.so.8.0 locally
I tensorflow/stream_executor/dso_loader.cc:135] successfully opened CUDA library libcuda.so.1 locally
I tensorflow/stream_executor/dso_loader.cc:135] successfully opened CUDA library libcurand.so.8.0 locally
INFO:tensorflow:Building model.
INFO:tensorflow:Initializing vocabulary from file: /word_counts.txt
INFO:tensorflow:Created vocabulary with 11520 words
INFO:tensorflow:Running caption generation on 1 files matching /1.jpg
W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE3 instructions, but these are available on your machine and could speed up CPU computations.
W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.1 instructions, but these are available on your machine and could speed up CPU computations.
W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.
I tensorflow/core/common_runtime/gpu/gpu_device.cc:885] Found device 0 with properties:
name: GeForce GTX  major: 6 minor: 1 memoryClockRate (GHz) 1.7335
pciBusID 0000:0a:00.0
Total memory: 7.92GiB
Free memory: 7.81GiB
W tensorflow/stream_executor/cuda/cuda_driver.cc:590] creating context when one is currently active; existing: 0x541a970
I tensorflow/core/common_runtime/gpu/gpu_device.cc:885] Found device 1 with properties:
name: GeForce GTX  major: 6 minor: 1 memoryClockRate (GHz) 1.7335
pciBusID 0000:09:00.0
Total memory: 7.92GiB
Free memory: 7.81GiB
W tensorflow/stream_executor/cuda/cuda_driver.cc:590] creating context when one is currently active; existing: 0x541e2f0
I tensorflow/core/common_runtime/gpu/gpu_device.cc:885] Found device 2 with properties:
name: GeForce GTX  major: 6 minor: 1 memoryClockRate (GHz) 1.7335
pciBusID 0000:06:00.0
Total memory: 7.92GiB
Free memory: 7.81GiB
W tensorflow/stream_executor/cuda/cuda_driver.cc:590] creating context when one is currently active; existing: 0x5421c70
I tensorflow/core/common_runtime/gpu/gpu_device.cc:885] Found device 3 with properties:
name: GeForce GTX  major: 6 minor: 1 memoryClockRate (GHz) 1.7335
pciBusID 0000:05:00.0
Total memory: 7.92GiB
Free memory: 7.57GiB
I tensorflow/core/common_runtime/gpu/gpu_device.cc:906] DMA: 0 1 2 3
I tensorflow/core/common_runtime/gpu/gpu_device.cc:916] 0:   Y Y Y Y
I tensorflow/core/common_runtime/gpu/gpu_device.cc:916] 1:   Y Y Y Y
I tensorflow/core/common_runtime/gpu/gpu_device.cc:916] 2:   Y Y Y Y
I tensorflow/core/common_runtime/gpu/gpu_device.cc:916] 3:   Y Y Y Y
I tensorflow/core/common_runtime/gpu/gpu_device.cc:975] Creating TensorFlow device (/gpu:0) -> (device: 0, name: GeForce GTX, pci bus id: 0000:0a:00.0)
I tensorflow/core/common_runtime/gpu/gpu_device.cc:975] Creating TensorFlow device (/gpu:1) -> (device: 1, name: GeForce GTX, pci bus id: 0000:09:00.0)
I tensorflow/core/common_runtime/gpu/gpu_device.cc:975] Creating TensorFlow device (/gpu:2) -> (device: 2, name: GeForce GTX, pci bus id: 0000:06:00.0)
I tensorflow/core/common_runtime/gpu/gpu_device.cc:975] Creating TensorFlow device (/gpu:3) -> (device: 3, name: GeForce GTX, pci bus id: 0000:05:00.0)
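The log above shows run_inference claiming all four GPUs, even though caption generation only needs one. If you want to leave the other cards free, restricting visibility with the standard CUDA environment variable works; this is just a convenience, not something from the original post:

$ CUDA_VISIBLE_DEVICES=0 bazel-bin/im2txt/run_inference \
    --checkpoint_path=${CHECKPOINT_PATH} \
    --vocab_file=${VOCAB_FILE} \
    --input_files=${IMAGE_FILE}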
