* D = (X,y) is a dataset derived from f,
* x* is the true optimum of f in O (minimum or maximum).
`doframework` feeds your algorithm constraints and data (O,D) and collects its predicted optimum. The algorithm's predicted optimal value can then be compared to the true optimal value f(x*). By comparing the two over multiple randomly generated optimization problems, `doframework` produces a **prediction profile** for your algorithm.
`doframework` integrates with your algorithm (written in Python).
`doframework` can run either locally or remotely. For optimal performance, run it on a Kubernetes cluster. Cloud configuration is currently available for AWS and IBM Cloud [OpenShift](https://docs.openshift.com/ "RedHat OpenShift Documentation") clusters.
The framework uses storage (local or S3) to interact with simulation products. Configuration is currently available for [AWS S3](https://aws.amazon.com/s3/ "AWS S3") or [IBM Cloud Object Storage (COS)](https://www.ibm.com/cloud/object-storage "IBM Cloud Object Storage").
# Install

```
$ pip install doframework
```

# Configs
Storage specifications are provided in a `configs.yaml`. You'll find examples under `./configs/*`.
The `configs.yaml` includes the list of source and target bucket names (under `buckets`). If necessary, S3 credentials are added under designated fields.
Here is the format of `configs.yaml`, either for local storage

```
local:
  buckets:
    inputs: '<inputs-folder>'
    inputs_dest: '<inputs-dest-folder>'
    objectives: '<objectives-folder>'
    objectives_dest: '<objectives-dest-folder>'
    data: '<data-folder>'
    data_dest: '<data-dest-folder>'
    solutions: '<solutions-folder>'
```

or S3

```
s3:
  buckets:
    inputs: '<inputs-bucket>'
    inputs_dest: '<inputs-dest-bucket>'
    objectives: '<objectives-bucket>'
    objectives_dest: '<objectives-dest-bucket>'
    data: '<data-bucket>'
    data_dest: '<data-dest-bucket>'
    solutions: '<solutions-bucket>'
  aws_secret_access_key: 'xxxx'   # credential fields, if required
  aws_access_key_id: 'xxxx'
  endpoint_url: 'https://xxx.xxx.xxx'
  region: 'xx-xxxx'
  cloud_service_provider: 'aws'
```

Currently, two S3 providers are available under `s3:cloud_service_provider`: either `aws` or `ibm`. The `endpoint_url` is _optional_ for AWS.
**Bucket / folder names must be distinct**.
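
Before launching an experiment against S3, it can help to verify that your credentials and buckets are reachable. Here is a minimal sketch (not part of `doframework`; it assumes `boto3` is installed and reuses the values from your `configs.yaml`):

```
import boto3

# Values copied from the s3 section of your configs.yaml.
s3 = boto3.client(
    's3',
    aws_access_key_id='xxxx',
    aws_secret_access_key='xxxx',
    endpoint_url='https://xxx.xxx.xxx',  # omit for AWS
    region_name='xx-xxxx',
)

# head_bucket raises a ClientError if a bucket is missing or inaccessible.
for bucket in ['<inputs-bucket>', '<objectives-bucket>', '<data-bucket>', '<solutions-bucket>']:
    s3.head_bucket(Bucket=bucket)
print('All buckets reachable.')
```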
# Inputs
`input.json` files provide the necessary metadata for the random generation of optimization problems.
`doframework` will run end to end once `input.json` files are uploaded to `<inputs-bucket>` (or `<inputs-folder>` for local storage).
The Jupyter notebook `./notebooks/inputs.ipynb` allows you to automatically generate input files and upload them to `<inputs-bucket>`.
Here is an example of an input file (see the sample `input_basic.json` under `./inputs`).
```
{
    "f": {
        "vertices": {
            "num": 7,
            "range": [[0.0, 1.0], [0.0, 1.0]]
        },
        "values": {
            "range": [0.0, 5.0]
        }
    },
    "omega": {
        "ratio": 0.8
    },
    "data": {
        "N": 750,
        "noise": 0.01,
        "policy_num": 2,
        "scale": 0.4
    },
    "input_file_name": "input_basic.json"
}
```
`f:vertices:num`: number of vertices in the piecewise-linear graph of f.<br>
`f:vertices:range`: the domain of f will lie inside this box range.<br>
`f:values:range`: range of f values.<br>
`omega:ratio`: vol(O) / vol(dom(f)) >= ratio.<br>
`data:N`: number of data points to sample.<br>
`data:noise`: response variable noise.<br>
`data:policy_num`: number of centers in the Gaussian mixture distribution of the data.<br>

It's a good idea to start experimenting on low-dimensional problems.

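For instance, with local storage, generating an input is just a matter of writing the JSON into your configured inputs folder. Here is a minimal sketch of what `./notebooks/inputs.ipynb` automates (the folder path is illustrative):

```
import json
import os

# A low-dimensional input spec, following the schema documented above.
spec = {
    "f": {
        "vertices": {"num": 7, "range": [[0.0, 1.0], [0.0, 1.0]]},
        "values": {"range": [0.0, 5.0]},
    },
    "omega": {"ratio": 0.8},
    "data": {"N": 750, "noise": 0.01, "policy_num": 2, "scale": 0.4},
    "input_file_name": "input_basic.json",
}

# Write into the '<inputs-folder>' configured in configs.yaml.
inputs_folder = 'inputs'
os.makedirs(inputs_folder, exist_ok=True)
with open(f'{inputs_folder}/input_basic.json', 'w') as f:
    json.dump(spec, f, indent=4)
```
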
# User App Integration
Your algorithm will be integrated into `doframework` once it is decorated with `doframework.resolve`.
A `doframework` experiment runs with `doframework.run()`. The `run()` utility accepts the decorated model and an absolute path to the `configs.yaml`.
Here is an example of a user application, `module.py`.
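
(A minimal sketch: the best-feasible-point rule below stands in for your own model fitting and optimization, and the returned argument-value pair is an assumed output format.)

```
import numpy as np
import doframework as dof

@dof.resolve
def alg(data: np.ndarray, constraints: np.ndarray, **kwargs):

    # Features sit in all but the last column, the response in the last.
    X, y = data[:, :-1], data[:, -1]

    # kwargs carries lower_bound, upper_bound, and optionally init_value.
    # Toy stand-in for a real algorithm: keep the data points satisfying
    # A[:, :-1] x + A[:, -1] <= 0, then take the feasible point with the
    # smallest observed response (assuming minimization).
    A = constraints
    feasible = (X @ A[:, :-1].T + A[:, -1] <= 0).all(axis=1)
    best = np.argmin(np.where(feasible, y, np.inf))

    return X[best], y[best]  # assumed: predicted optimal arg and value

if __name__ == '__main__':
    dof.run(alg, '/absolute/path/to/configs.yaml', objectives=3, datasets=2)
```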
`doframework` provides the following inputs to your algorithm:
`data`: 2D np.array with features X = data[:, :-1] and response variable y = data[:, -1].<br>
`constraints`: linear constraints as a 2D np.array A. A data point x satisfies the constraints when A[:, :-1]*x + A[:, -1] <= 0.<br>

`doframework` feeds your algorithm additional inputs as keyword arguments:

`lower_bound`: lower bound per feature variable.<br>
`upper_bound`: upper bound per feature variable.<br>
`init_value`: optional initial value.<br>

The `run()` utility accepts the arguments:

`objectives`: number of objective targets to generate per input file.<br>
`datasets`: number of datasets to generate per objective target.<br>
`feasibility_regions`: number of feasibility regions to generate per objective and dataset.<br>
`distribute`: True to run distributively, False to run sequentially.<br>
`logger`: True to see `doframework` logs, False otherwise.<br>
`after_idle_for`: stop running when event stream is idle after this many seconds.<br>
`alg_num_cpus`: number of CPUs to dedicate to your algorithm on each optimization task.<br>
`data_num_cpus`: number of CPUs to dedicate to data generation (useful in high dimensions).
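
For example (illustrative values; `alg` is the decorated model from `module.py` above):

```
import doframework as dof

dof.run(
    alg,                      # model decorated with doframework.resolve
    '/absolute/path/to/configs.yaml',
    objectives=3,             # objective targets per input file
    datasets=2,               # datasets per objective target
    feasibility_regions=1,    # feasibility regions per objective and dataset
    distribute=True,          # run distributively
    logger=True,              # print doframework logs
    after_idle_for=200,       # stop after 200 idle seconds
    alg_num_cpus=1,           # CPUs per optimization task
)
```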
# Algorithm Prediction Profile
Once you are done running a `doframework` experiment, run the notebook `notebooks/profile.ipynb`. It will fetch the relevant experiment products from the target buckets and produce the algorithm's prediction profile and prediction probabilities.
`doframework` produces three types of experiment product files:
* `objective.json`: containing information on (f, O, x*)
* `data.csv`: containing the dataset the algorithm accepts as input
* `solution.json`: containing the algorithm's predicted optimum
See sample files under `./outputs`.
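
As an illustration of what post-processing looks like (the notebook `notebooks/profile.ipynb` does this properly; the field names below are hypothetical, so inspect the samples under `./outputs` for the actual schema):

```
import json

# Hypothetical field names, for illustration only.
with open('outputs/objective.json') as f:
    true_val = json.load(f)['optimum_value']
with open('outputs/solution.json') as f:
    predicted_val = json.load(f)['predicted_value']

print(f'relative error: {abs(predicted_val - true_val) / abs(true_val):.2%}')
```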
# Kubernetes Cluster
To run `doframework` on a K8S cluster, make sure you are on the cluster's local `kubectl` context. Log into your cluster, if necessary (applicable to OpenShift, see `./doc/openshift.md`).
You can check your local `kubectl` context and change it if necessary with
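
```
$ kubectl config current-context
$ kubectl config use-context <cluster-context>
```
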
Now `cd` into your project's folder and run the setup bash script `doframework-setup.sh`. The setup script will generate the cluster configuration file `doframework.yaml` in your project's folder; it requires the absolute path to your `configs.yaml`. Running the setup script will establish the `ray` cluster.
```
$ cd <user_project_folder>
$ doframework-setup.sh --configs <absolute-path-to-configs.yaml>   # flag name assumed; check the script's usage
```

Once the cluster is up, submit your application with

```
$ ray submit doframework.yaml module.py
```

# Ray Cluster
To observe the `ray` dashboard, connect to `http://localhost:8265` in your browser. See `./doc/openshift.md` for OpenShift-specific instructions.
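
If nothing is listening on that port yet, the Ray CLI can set up the port forwarding for you (a sketch, assuming the cluster was launched from `doframework.yaml`):

```
$ ray dashboard doframework.yaml
```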
# AAAI 2023 OCL Lab Instructions
Here are the installation instructions for participants of the OCL Lab.
## `doframework` Installation
22
+
23
+
We recommend installing `doframework` in a designated Python 3.8.0 environment. `doframework` has many dependencies that may override package versions in your current Python environment.
For example, if you're using `pyenv` in combination with `virtualenv` as your Python environment manager, you can type the following in your terminal
```
$ pyenv virtualenv 3.8.0 dof
$ pyenv local dof
```
[Here](https://realpython.com/intro-to-pyenv/#virtual-environments-and-pyenv "pyenv and virtualenv") is a good source on `pyenv` and `virtualenv` by Logan Jones.
Now that you've set up a dedicated Python environment, simply install
```
$ pip install doframework
```
Run a simple sanity check with
```
$ python
>>> import doframework
>>> exit()
```
The import command may take a while. Once it's finished (successfully, hopefully) you can exit.
## `doframework` Cloning
We will be running `doframework` Jupyter Notebooks as well as using other `doframework` material. Therefore, we'll clone a local copy of `doframework`. From your terminal, run
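
```
$ git clone https://github.com/IBM/doframework.git
```
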
To launch the OCL lab Jupyter Notebooks, we'll need to add `jupyter` to our new Python environment
```
$ pip install jupyter
```
Note that `jupyter` does not come with `doframework`. We want to keep `doframework` light for cloud distribution. Once we're done installing `jupyter`, let's launch the OCL Lab notebooks
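
```
$ cd doframework   # from inside the cloned repo
$ jupyter notebook
```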