Commit ded1605: Update version to 0.13.0
1 parent 59be3e6
22 files changed (+55 additions, -64 deletions)

README.md

Lines changed: 12 additions & 20 deletions

````diff
@@ -4,10 +4,6 @@ Cortex is an open source platform for deploying machine learning models as produ
 
 <br>
 
-<!-- Delete on release branches -->
-<!-- CORTEX_VERSION_README_MINOR -->
-[install](https://cortex.dev/install)[tutorial](https://cortex.dev/iris-classifier)[docs](https://cortex.dev)[examples](https://github.com/cortexlabs/cortex/tree/0.12/examples)[we're hiring](https://angel.co/cortex-labs-inc/jobs)[email us](mailto:hello@cortex.dev)[chat with us](https://gitter.im/cortexlabs/cortex)<br><br>
-
 <!-- Set header Cache-Control=no-cache on the S3 object metadata (see https://help.github.com/en/articles/about-anonymized-image-urls) -->
 ![Demo](https://d1zqebknpdh033.cloudfront.net/demo/gif/v0.12.gif)
 
@@ -33,7 +29,7 @@ Cortex is designed to be self-hosted on any AWS account. You can spin up a clust
 <!-- CORTEX_VERSION_README_MINOR -->
 ```bash
 # install the CLI on your machine
-$ bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/0.12/get-cli.sh)"
+$ bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/0.13/get-cli.sh)"
 
 # provision infrastructure on AWS and spin up a cluster
 $ cortex cluster up
@@ -70,11 +66,7 @@ class PythonPredictor:
 ```yaml
 # cortex.yaml
 
-- kind: deployment
-  name: sentiment
-
-- kind: api
-  name: classifier
+- name: sentiment-classifier
   predictor:
     type: python
     path: predictor.py
@@ -90,13 +82,13 @@ class PythonPredictor:
 ```bash
 $ cortex deploy
 
-creating classifier (http://***.amazonaws.com/sentiment/classifier)
+creating sentiment-classifier
 ```
 
 ### Serve real-time predictions
 
 ```bash
-$ curl http://***.amazonaws.com/sentiment/classifier \
+$ curl http://***.amazonaws.com/sentiment-classifier \
     -X POST -H "Content-Type: application/json" \
     -d '{"text": "the movie was amazing!"}'
 
@@ -106,10 +98,10 @@ positive
 ### Monitor your deployment
 
 ```bash
-$ cortex get classifier --watch
+$ cortex get sentiment-classifier --watch
 
-status   up-to-date   requested   last update   avg inference
-live     1            1           8s            24ms
+status   up-to-date   requested   last update   avg inference   2XX
+live     1            1           8s            24ms            12
 
 class     count
 positive  8
@@ -133,8 +125,8 @@ The CLI sends configuration and code to the cluster every time you run `cortex d
 ## Examples of Cortex deployments
 
 <!-- CORTEX_VERSION_README_MINOR x5 -->
-* [Sentiment analysis](https://github.com/cortexlabs/cortex/tree/0.12/examples/tensorflow/sentiment-analyzer): deploy a BERT model for sentiment analysis.
-* [Image classification](https://github.com/cortexlabs/cortex/tree/0.12/examples/tensorflow/image-classifier): deploy an Inception model to classify images.
-* [Search completion](https://github.com/cortexlabs/cortex/tree/0.12/examples/pytorch/search-completer): deploy Facebook's RoBERTa model to complete search terms.
-* [Text generation](https://github.com/cortexlabs/cortex/tree/0.12/examples/pytorch/text-generator): deploy Hugging Face's DistilGPT2 model to generate text.
-* [Iris classification](https://github.com/cortexlabs/cortex/tree/0.12/examples/sklearn/iris-classifier): deploy a scikit-learn model to classify iris flowers.
+* [Sentiment analysis](https://github.com/cortexlabs/cortex/tree/0.13/examples/tensorflow/sentiment-analyzer): deploy a BERT model for sentiment analysis.
+* [Image classification](https://github.com/cortexlabs/cortex/tree/0.13/examples/tensorflow/image-classifier): deploy an Inception model to classify images.
+* [Search completion](https://github.com/cortexlabs/cortex/tree/0.13/examples/pytorch/search-completer): deploy Facebook's RoBERTa model to complete search terms.
+* [Text generation](https://github.com/cortexlabs/cortex/tree/0.13/examples/pytorch/text-generator): deploy Hugging Face's DistilGPT2 model to generate text.
+* [Iris classification](https://github.com/cortexlabs/cortex/tree/0.13/examples/sklearn/iris-classifier): deploy a scikit-learn model to classify iris flowers.
````
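The README's `curl` call can equally be driven from Python. A minimal sketch using only the standard library; the endpoint URL is a placeholder, just as in the README:

```python
import json
import urllib.request

def build_request(url: str, text: str) -> urllib.request.Request:
    """Build the same POST request the README issues with curl."""
    payload = json.dumps({"text": text}).encode()
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("http://localhost/sentiment-classifier", "the movie was amazing!")
# urllib.request.urlopen(req).read() would return the prediction body
```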

build/build-image.sh

Lines changed: 1 addition & 1 deletion

````diff
@@ -19,7 +19,7 @@ set -euo pipefail
 
 ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")"/.. >/dev/null && pwd)"
 
-CORTEX_VERSION=master
+CORTEX_VERSION=0.13.0
 
 dir=$1
 image=$2
````

build/cli.sh

Lines changed: 1 addition & 1 deletion

````diff
@@ -19,7 +19,7 @@ set -euo pipefail
 
 ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")"/.. >/dev/null && pwd)"
 
-CORTEX_VERSION=master
+CORTEX_VERSION=0.13.0
 
 arg1=${1:-""}
 upload="false"
````

build/push-image.sh

Lines changed: 1 addition & 1 deletion

````diff
@@ -17,7 +17,7 @@
 
 set -euo pipefail
 
-CORTEX_VERSION=master
+CORTEX_VERSION=0.13.0
 
 image=$1
 
````

docs/cluster-management/config.md

Lines changed: 20 additions & 20 deletions

````diff
@@ -43,27 +43,27 @@ instance_volume_size: 50
 log_group: cortex
 
 # whether to use spot instances in the cluster (default: false)
-# see https://cortex.dev/v/master/cluster-management/spot-instances for additional details on spot configuration
+# see https://cortex.dev/v/0.13/cluster-management/spot-instances for additional details on spot configuration
 spot: false
 
 # docker image paths
-image_python_serve: cortexlabs/python-serve:master
-image_python_serve_gpu: cortexlabs/python-serve-gpu:master
-image_tf_serve: cortexlabs/tf-serve:master
-image_tf_serve_gpu: cortexlabs/tf-serve-gpu:master
-image_tf_api: cortexlabs/tf-api:master
-image_onnx_serve: cortexlabs/onnx-serve:master
-image_onnx_serve_gpu: cortexlabs/onnx-serve-gpu:master
-image_operator: cortexlabs/operator:master
-image_manager: cortexlabs/manager:master
-image_downloader: cortexlabs/downloader:master
-image_cluster_autoscaler: cortexlabs/cluster-autoscaler:master
-image_metrics_server: cortexlabs/metrics-server:master
-image_nvidia: cortexlabs/nvidia:master
-image_fluentd: cortexlabs/fluentd:master
-image_statsd: cortexlabs/statsd:master
-image_istio_proxy: cortexlabs/istio-proxy:master
-image_istio_pilot: cortexlabs/istio-pilot:master
-image_istio_citadel: cortexlabs/istio-citadel:master
-image_istio_galley: cortexlabs/istio-galley:master
+image_python_serve: cortexlabs/python-serve:0.13.0
+image_python_serve_gpu: cortexlabs/python-serve-gpu:0.13.0
+image_tf_serve: cortexlabs/tf-serve:0.13.0
+image_tf_serve_gpu: cortexlabs/tf-serve-gpu:0.13.0
+image_tf_api: cortexlabs/tf-api:0.13.0
+image_onnx_serve: cortexlabs/onnx-serve:0.13.0
+image_onnx_serve_gpu: cortexlabs/onnx-serve-gpu:0.13.0
+image_operator: cortexlabs/operator:0.13.0
+image_manager: cortexlabs/manager:0.13.0
+image_downloader: cortexlabs/downloader:0.13.0
+image_cluster_autoscaler: cortexlabs/cluster-autoscaler:0.13.0
+image_metrics_server: cortexlabs/metrics-server:0.13.0
+image_nvidia: cortexlabs/nvidia:0.13.0
+image_fluentd: cortexlabs/fluentd:0.13.0
+image_statsd: cortexlabs/statsd:0.13.0
+image_istio_proxy: cortexlabs/istio-proxy:0.13.0
+image_istio_pilot: cortexlabs/istio-pilot:0.13.0
+image_istio_citadel: cortexlabs/istio-citadel:0.13.0
+image_istio_galley: cortexlabs/istio-galley:0.13.0
 ```
````

docs/cluster-management/install.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -12,7 +12,7 @@ See [cluster configuration](config.md) to learn how you can customize your clust
 <!-- CORTEX_VERSION_MINOR -->
 ```bash
 # install the CLI on your machine
-bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/master/get-cli.sh)"
+bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/0.13/get-cli.sh)"
 
 # provision infrastructure on AWS and spin up a cluster
 cortex cluster up
@@ -26,7 +26,7 @@ Note: This will create resources in your AWS account which aren't included in th
 
 ```bash
 # clone the Cortex repository
-git clone -b master https://github.com/cortexlabs/cortex.git
+git clone -b 0.13 https://github.com/cortexlabs/cortex.git
 
 # navigate to the TensorFlow iris classification example
 cd cortex/examples/tensorflow/iris-classifier
````

docs/cluster-management/update.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -22,7 +22,7 @@ cortex cluster update
 cortex cluster down
 
 # update your CLI
-bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/master/get-cli.sh)"
+bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/0.13/get-cli.sh)"
 
 # confirm version
 cortex version
````

docs/deployments/onnx.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -55,7 +55,7 @@ You can log information about each request by adding a `?debug=true` parameter t
 An ONNX Predictor is a Python class that describes how to serve your ONNX model to make predictions.
 
 <!-- CORTEX_VERSION_MINOR -->
-Cortex provides an `onnx_client` and a config object to initialize your implementation of the ONNX Predictor class. The `onnx_client` is an instance of [ONNXClient](https://github.com/cortexlabs/cortex/tree/master/pkg/workloads/cortex/lib/client/onnx.py) that manages an ONNX Runtime session and helps make predictions using your model. Once your implementation of the ONNX Predictor class has been initialized, the replica is available to serve requests. Upon receiving a request, your implementation's `predict()` function is called with the JSON payload and is responsible for returning a prediction or batch of predictions. Your `predict()` function should call `onnx_client.predict()` to make an inference against your exported ONNX model. Preprocessing of the JSON payload and postprocessing of predictions can be implemented in your `predict()` function as well.
+Cortex provides an `onnx_client` and a config object to initialize your implementation of the ONNX Predictor class. The `onnx_client` is an instance of [ONNXClient](https://github.com/cortexlabs/cortex/tree/0.13/pkg/workloads/cortex/lib/client/onnx.py) that manages an ONNX Runtime session and helps make predictions using your model. Once your implementation of the ONNX Predictor class has been initialized, the replica is available to serve requests. Upon receiving a request, your implementation's `predict()` function is called with the JSON payload and is responsible for returning a prediction or batch of predictions. Your `predict()` function should call `onnx_client.predict()` to make an inference against your exported ONNX model. Preprocessing of the JSON payload and postprocessing of predictions can be implemented in your `predict()` function as well.
 
 ## Implementation
 
````
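The Predictor interface described in that paragraph can be sketched as follows. This is an illustration of the shape of the class, not Cortex's actual implementation; `StubClient` stands in for the real `onnx_client` (which wraps an ONNX Runtime session) so the sketch is self-contained:

```python
class StubClient:
    """Stand-in for onnx_client; a real client runs the exported ONNX model."""
    def predict(self, model_input):
        return [0.1, 0.9]  # placeholder model output

class ONNXPredictor:
    def __init__(self, onnx_client, config):
        # called once when the replica initializes; keep the client for predict()
        self.client = onnx_client
        self.labels = config.get("labels", ["negative", "positive"])

    def predict(self, payload):
        # preprocessing of the JSON payload
        model_input = [payload["text"]]
        scores = self.client.predict(model_input)
        # postprocessing: map model output back to a label
        return self.labels[scores.index(max(scores))]

predictor = ONNXPredictor(StubClient(), {"labels": ["negative", "positive"]})
print(predictor.predict({"text": "the movie was amazing!"}))  # positive
```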
docs/deployments/tensorflow.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -56,7 +56,7 @@ You can log information about each request by adding a `?debug=true` parameter t
 A TensorFlow Predictor is a Python class that describes how to serve your TensorFlow model to make predictions.
 
 <!-- CORTEX_VERSION_MINOR -->
-Cortex provides a `tensorflow_client` and a config object to initialize your implementation of the TensorFlow Predictor class. The `tensorflow_client` is an instance of [TensorFlowClient](https://github.com/cortexlabs/cortex/tree/master/pkg/workloads/cortex/lib/client/tensorflow.py) that manages a connection to a TensorFlow Serving container via gRPC to make predictions using your model. Once your implementation of the TensorFlow Predictor class has been initialized, the replica is available to serve requests. Upon receiving a request, your implementation's `predict()` function is called with the JSON payload and is responsible for returning a prediction or batch of predictions. Your `predict()` function should call `tensorflow_client.predict()` to make an inference against your exported TensorFlow model. Preprocessing of the JSON payload and postprocessing of predictions can be implemented in your `predict()` function as well.
+Cortex provides a `tensorflow_client` and a config object to initialize your implementation of the TensorFlow Predictor class. The `tensorflow_client` is an instance of [TensorFlowClient](https://github.com/cortexlabs/cortex/tree/0.13/pkg/workloads/cortex/lib/client/tensorflow.py) that manages a connection to a TensorFlow Serving container via gRPC to make predictions using your model. Once your implementation of the TensorFlow Predictor class has been initialized, the replica is available to serve requests. Upon receiving a request, your implementation's `predict()` function is called with the JSON payload and is responsible for returning a prediction or batch of predictions. Your `predict()` function should call `tensorflow_client.predict()` to make an inference against your exported TensorFlow model. Preprocessing of the JSON payload and postprocessing of predictions can be implemented in your `predict()` function as well.
 
 ## Implementation
 
````
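The TensorFlow Predictor follows the same shape; the sketch below is illustrative only, with a stub standing in for the real `tensorflow_client` (which talks to a TensorFlow Serving container over gRPC) so it runs on its own:

```python
class StubTFClient:
    """Stand-in for tensorflow_client; a real client queries TensorFlow Serving."""
    def predict(self, model_input):
        # TensorFlow Serving responses carry named output tensors
        return {"probabilities": [0.02, 0.98]}

class TensorFlowPredictor:
    def __init__(self, tensorflow_client, config):
        # called once when the replica initializes
        self.client = tensorflow_client

    def predict(self, payload):
        # preprocess the JSON payload, infer, then postprocess to a label
        probs = self.client.predict({"text": [payload["text"]]})["probabilities"]
        return "positive" if probs[1] > probs[0] else "negative"

predictor = TensorFlowPredictor(StubTFClient(), {})
print(predictor.predict({"text": "great!"}))  # positive
```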
docs/packaging-models/tensorflow.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -1,7 +1,7 @@
 # TensorFlow
 
 <!-- CORTEX_VERSION_MINOR -->
-Export your trained model and upload the export directory, or a checkpoint directory containing the export directory (which is usually the case if you used `estimator.train_and_evaluate`). An example is shown below (here is the [complete example](https://github.com/cortexlabs/cortex/blob/master/examples/tensorflow/sentiment-analyzer)):
+Export your trained model and upload the export directory, or a checkpoint directory containing the export directory (which is usually the case if you used `estimator.train_and_evaluate`). An example is shown below (here is the [complete example](https://github.com/cortexlabs/cortex/blob/0.13/examples/tensorflow/sentiment-analyzer)):
 
 ```python
 import tensorflow as tf
````
