This is a toy example of a cloud function using Fashion MNIST.
You upload a Fashion MNIST image, randomly chosen from the test dataset, to Cloud Storage (GCS) for inference. The pretrained model is also stored in GCS. You send an HTTP request to Cloud Functions to get the inference result. You will see the status code and the inference result like this in your terminal:
Test image[9369] is created!
tensorflow/test.png in gs://your_bucket_name is deleted.
Uploading data/test.png to gs://your_bucket_name is finished.
200
Bag
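The last line of the output is the predicted class name. Fashion MNIST has ten classes, and a handler typically maps the predicted class index to its name. A minimal sketch (the list below is the standard Fashion MNIST label set, not code copied from this repo):

```python
# Standard Fashion MNIST class names, indexed by label id (0-9).
CLASS_NAMES = [
    "T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
    "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot",
]

def label_name(class_index: int) -> str:
    """Translate a predicted class index into a human-readable label."""
    return CLASS_NAMES[class_index]
```

For example, a model output of class index 8 yields the "Bag" result shown above.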
Locally, run
poetry install
cd src
python3 train.py
Then you will get fashion_mnist_weights.data-00000-of-00001 and fashion_mnist_weights.index.
These files are then put in Cloud Storage.
In this example, you use two GCP services: Cloud Storage (GCS) and Cloud Functions. There is a free tier for new GCP users, and this toy program can be run within it.
In your Cloud Storage bucket, prepare the directories as follows, and put the files created in the previous step into the tensorflow directory.
├── tensorflow
│ ├── fashion_mnist_weights.data-00000-of-00001
│ └── fashion_mnist_weights.index
└── tmp
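Uploading the two weight files into the tensorflow directory can be scripted with `gsutil`. This sketch only builds the copy commands; the bucket name is a placeholder, and the actual upload (commented out) requires the Cloud SDK to be installed and authenticated:

```python
import subprocess

# Checkpoint files produced by train.py in the previous step.
WEIGHT_FILES = [
    "fashion_mnist_weights.data-00000-of-00001",
    "fashion_mnist_weights.index",
]

def upload_commands(bucket_name: str) -> list:
    """Build one `gsutil cp` command per weight file, targeting the
    tensorflow/ directory of the given bucket."""
    return [
        ["gsutil", "cp", f, "gs://{}/tensorflow/{}".format(bucket_name, f)]
        for f in WEIGHT_FILES
    ]

# To actually upload:
# for cmd in upload_commands("your_bucket_name"):
#     subprocess.run(cmd, check=True)
```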
Next, you need a credential key to access the GCP bucket from your local machine, because you upload test.png into the tensorflow directory in GCS.
Open the Google Cloud command line tool (Cloud Shell) in GCP, then run these commands. Fill in your project ID at {project_id}; you don't need the braces.
gcloud iam service-accounts create gcs-access
gcloud projects add-iam-policy-binding {project_id} --member=serviceAccount:gcs-access@{project_id}.iam.gserviceaccount.com --role="roles/storage.admin"
gcloud iam service-accounts keys create gcs-access.json --iam-account gcs-access@{project_id}.iam.gserviceaccount.com
Then you will get a credential key, "gcs-access.json". Copy it into your key directory.
This key grants access to your GCS, so keep it private.
For more detail, see below.
Please copy and paste main.py and requirements.txt in src/cloud-functions as the code for Cloud Functions.
Settings for Cloud Functions:
Memory: 2 GB
Runtime: Python 3.7
Entry point: handler
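With the entry point set to handler, Cloud Functions calls a function named `handler(request)` in main.py, where `request` is a Flask request object. A stripped-down skeleton of that shape (the parameter name `bucket_name` and the response values are illustrative assumptions, not this repo's actual code):

```python
def handler(request):
    """Cloud Functions HTTP entry point.

    Illustrative skeleton only -- the real main.py would download the
    weights and the uploaded image from GCS and run inference before
    responding.
    """
    bucket_name = request.args.get("bucket_name", "")
    if not bucket_name:
        # Returning a (body, status) tuple sets the HTTP status code.
        return "bucket_name is required", 400
    # ... download weights + test.png from gs://{bucket_name}, predict ...
    return "Bag"  # e.g. the predicted class name
```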
In this code example, HTTP requests are used, so a permission must be added.
Navigate to the permissions page and choose allUsers for the member and Cloud Functions Invoker (shown as "Cloud Functions 起動元" in the Japanese console) for the role.
If you don't set this permission, you will get a 403 error.
See this site(Japanese) as reference.
After deploying the Cloud Function, copy the trigger, whose form is http://......, to pass to main.py.
This trigger can be accessed publicly because of the permission setting mentioned above, so you should not expose this URL to the public. Cloud Functions incurs charges based on the time spent each time it is triggered.
python3 main.py --project_id {YOUR_PROJECT_ID} --bucket_name {YOUR_BUCKET_NAME} --trigger {YOUR_TRIGGER}
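Under the hood, main.py uploads the test image and then calls the trigger URL over HTTP. A minimal stdlib sketch of the request step (the query parameter name is an assumption for illustration; the actual network call is commented out):

```python
from urllib import parse, request

def build_trigger_request(trigger: str, bucket_name: str) -> request.Request:
    """Construct the HTTP request for the Cloud Functions trigger,
    passing the bucket name as a query parameter (the parameter name
    is an assumption, not taken from this repo)."""
    query = parse.urlencode({"bucket_name": bucket_name})
    return request.Request("{}?{}".format(trigger, query))

# To actually call the deployed function:
# with request.urlopen(build_trigger_request(trigger, bucket)) as resp:
#     print(resp.status)           # status code, e.g. 200
#     print(resp.read().decode())  # inference result, e.g. "Bag"
```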
The training code, main.py, and requirements.txt for Cloud Functions are based on this site.