How to Execute Local Applications with the Cloud SDK
It may seem strange to use the Cloud SDK to launch jobs locally. But the ML Engine is neither simple nor free, so you should test your applications locally before deploying them to the cloud. Another reason to execute your code locally is that you can view printed text on the command line instead of having to download and read logs.
You can launch a job on your development system by entering one of the following commands:
gcloud ml-engine local train: run a training job locally
gcloud ml-engine local predict: run a prediction job locally
These commands serve different purposes and accept different configuration flags.
Running a local training job
A GCP training job executes a Python package and produces output in the directory specified by the --job-dir flag. The following table lists --job-dir and other flags you can set for local training jobs.
Flags for Local Training
|Flag|Description|
|--module-name|Identifies the module to execute|
|--package-path|Path to the Python package containing the module to execute|
|--job-dir|Path to store training outputs|
|--distributed|Runs code in distributed mode|
|--parameter-server-count|Number of parameter servers to run|
|--start-port|Start of the range of ports reserved by the local cluster|
|--worker-count|Number of workers to run|
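As a sketch of how the distributed-mode flags combine, the following invocation launches a local cluster. The worker and parameter-server counts and the starting port are arbitrary values chosen for illustration, and the module and package names (trainer.task, trainer) are placeholders:

```shell
# Hypothetical example: run a trainer package in distributed mode
# with one parameter server and two workers. Ports starting at
# 27182 are reserved for the local cluster.
gcloud ml-engine local train \
  --module-name trainer.task \
  --package-path trainer \
  --job-dir output \
  --distributed \
  --parameter-server-count 1 \
  --worker-count 2 \
  --start-port 27182
```
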
The --package-path flag identifies the top-level directory of your package. This is the directory that contains your package’s setup.py file. The --module-name flag identifies the module to execute inside the package.
If you’d like to try this for yourself, copy the mnist_test.tfrecords files from the ch12 directory to the ch13 directory. Then go to the ch13/cloud_mnist directory and enter the following command:
gcloud ml-engine local train --module-name trainer.task --package-path trainer --job-dir output -- --data_dir ../images
In this command, --package-path indicates that the trainer directory represents a package, and --module-name indicates that the name of the package’s module is task. The --job-dir flag tells the application to store its results in a directory named output. Two dashes (--) precede --data_dir. This indicates that --data_dir and any following flags are defined by the user.
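To show how these flags reach your code, here is a minimal sketch of how a module might parse both the --job-dir flag and the user-defined --data_dir flag with argparse. The parse_args helper and the overall structure are assumptions for illustration, not the book's actual trainer code:

```python
import argparse

def parse_args(argv):
    """Parse the flags that gcloud forwards to the module."""
    parser = argparse.ArgumentParser()
    parser.add_argument('--job-dir',
                        help='Directory for training outputs')
    parser.add_argument('--data_dir',
                        help='User-defined flag: location of input data')
    return parser.parse_args(argv)

# Simulate the flags gcloud would pass through to the module
args = parse_args(['--job-dir', 'output', '--data_dir', '../images'])
print('Outputs go to', args.job_dir)
print('Data comes from', args.data_dir)
```

Note that argparse converts the dash in --job-dir to an underscore, so the value is available as args.job_dir.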
Running a local prediction job
After training is complete, you can launch a local prediction job by executing gcloud ml-engine local predict. The following table lists the flags you can set.
Flags for Local Prediction
|Flag|Description|
|--model-dir|Path of the model|
|--json-instances|Path to a local file containing prediction data in JSON format|
|--text-instances|Path to a local file containing prediction data in plain text|
You should assign the --model-dir flag to the directory that contains the output of the training operation. Also, you need to identify prediction parameters using the --json-instances or --text-instances flag.
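A --json-instances file holds one JSON-encoded instance per line. The following sketch writes such a file; the file name and the "image" field are hypothetical, since the exact structure depends on your model's inputs:

```python
# Write a hypothetical newline-delimited JSON file for --json-instances.
# Each line is one prediction instance; the 'image' field name is an
# assumption, not a requirement of the tool.
import json

def write_instances(path, instances):
    """Serialize each instance as a single JSON line."""
    with open(path, 'w') as f:
        for inst in instances:
            f.write(json.dumps(inst) + '\n')

write_instances('instances.json', [
    {'image': [0.0, 0.5, 1.0]},
    {'image': [1.0, 0.5, 0.0]},
])
```

You could then pass the file to the prediction command, for example: gcloud ml-engine local predict --model-dir output --json-instances instances.json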