@@ -5,16 +5,16 @@ Online Inference
Compiling TensorFlow or PyTorch runtimes into each existing simulation is
difficult. Maintaining that type of integration with the rapidly growing and changing
- APIs of libaries like TensorFlow and PyTorch is even more difficult.
+ APIs of libraries like TensorFlow and PyTorch is even more difficult.
Instead of forcing dependencies on the simulation code, SmartSim itself maintains those dependencies
and provides them in the ``Orchestrator`` database through RedisAI.
- Because of this, Simulations in Fortran, C, C++ and Python can call into PyTorch, TensorFlow,
+ Because of this, simulations in Fortran, C, C++ and Python can call into PyTorch, TensorFlow,
and any library that supports the ONNX format without having to compile in those libraries.
Therefore, we define *Online Inference* as the execution of machine learning models via
- requests to an application (Orchestrator) running seperately from the client program (Simulation)
+ requests to an application (Orchestrator) running separately from the client program (Simulation)
without exchanging data over the filesystem.
@@ -51,10 +51,10 @@ script that uses the SmartRedis Python client to perform invocations of the ML r
The above script will first launch the database, and then the Python script
containing the SmartRedis client code. The code here could
easily be adapted to launch a C, C++, or Fortran application containing
- the SmartRedis clients in those langugages as well.
+ the SmartRedis clients in those languages as well.
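For reference, a minimal sketch of what such a launching script might look like (the experiment name, port, and ``inference_script.py`` file are illustrative assumptions, and the exact ``Experiment`` API may differ between SmartSim versions):

.. code-block:: python

   from smartsim import Experiment

   # driver script: launch the database, then the client program
   exp = Experiment("online-inference", launcher="local")

   # a single-shard Orchestrator (Redis + RedisAI); port is an assumption
   db = exp.create_database(port=6780, interface="lo")

   # the client program; a compiled C, C++, or Fortran binary using the
   # SmartRedis clients in those languages could be launched the same way
   settings = exp.create_run_settings(exe="python", exe_args="inference_script.py")
   model = exp.create_model("inference-session", settings)

   exp.start(db)
   exp.start(model, block=True)
   exp.stop(db)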
Below are a few examples of scripts that could be used with the above
- code to perform online inference in with various ML backends supported
+ code to perform online inference with various ML backends supported
by SmartSim.
@@ -71,7 +71,7 @@ by SmartSim.
.. _trace: https://pytorch.org/docs/stable/generated/torch.jit.trace.html#torch.jit.trace
The Orchestrator supports both `PyTorch`_ models and `TorchScript`_ functions and scripts
- in `PyTorch`_ 1.7.1. To use onnx in SmartSim, specify
+ in `PyTorch`_ 1.7.1. To use PyTorch in SmartSim, specify
``TORCH`` as the argument for *backend* in the call to ``client.set_model`` or
``client.set_model_from_file``.
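For orientation before the full example, a hedged sketch of tracing a model and serializing it to an in-memory buffer, as the example below does (the model argument and the example input shape are illustrative assumptions):

.. code-block:: python

   import io

   import torch

   def serialize_model(model: torch.nn.Module) -> bytes:
       # trace the model with a representative example input
       example_input = torch.rand(1, 1, 28, 28)  # assumed input shape
       module = torch.jit.trace(model, example_input)

       # save the traced module into an in-memory buffer, not a file
       model_buffer = io.BytesIO()
       torch.jit.save(module, model_buffer)
       return model_buffer.getvalue()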
@@ -135,7 +135,7 @@ Torch documentation for `trace`_.
torch.jit.save(module, model_buffer)
return model_buffer.getvalue()
- Lastly, we use the SmartRedis Python client to connect to
+ Lastly, we use the SmartRedis Python client to
1. Connect to the database
2. Put a batch of 20 tensors into the database (``put_tensor``)
3. Set the Torch model in the database (``set_model``)
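A condensed sketch of those steps with the SmartRedis Python client (key names and tensor shape are illustrative; ``model`` holds the serialized bytes produced above, and ``cluster=False`` assumes a single-shard database):

.. code-block:: python

   import torch
   from smartredis import Client

   # 1. connect to the database launched by SmartSim
   client = Client(cluster=False)

   # 2. put a batch of 20 tensors into the database
   client.put_tensor("input", torch.rand(20, 1, 28, 28).numpy())

   # 3. set the serialized Torch model in the database
   client.set_model("cnn", model, "TORCH", device="CPU")

   # run the model on the stored batch and read back the output
   client.run_model("cnn", inputs=["input"], outputs=["output"])
   print(client.get_tensor("output"))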
@@ -185,7 +185,7 @@ SmartSim includes a utility to freeze the graph of a TensorFlow or Keras model in
``TF`` as the argument for *backend* in the call to ``client.set_model`` or
``client.set_model_from_file``.
- The example below shows how to use the utility to freeze an mnist model created in
+ The example below shows how to use the utility to freeze an MNIST model created in
Keras. This script can be used with the :ref:`SmartSim code <infrastructure_code>`
above to launch an inference session with a TensorFlow or Keras model.
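As a rough sketch of that flow (the layer sizes and file name are assumptions; ``freeze_model`` is shown here taking the model, an output directory, and a file name, and returning the frozen model path plus the input and output layer names):

.. code-block:: python

   from tensorflow import keras
   from smartsim.tf import freeze_model

   # a small stand-in for the MNIST Keras model in the example
   model = keras.Sequential([
       keras.layers.InputLayer(input_shape=(28, 28)),
       keras.layers.Flatten(),
       keras.layers.Dense(128, activation="relu"),
       keras.layers.Dense(10, activation="softmax"),
   ])
   model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

   # freeze the graph to a file; the returned layer names are the
   # ``inputs`` and ``outputs`` arguments needed by ``set_model``
   model_path, inputs, outputs = freeze_model(model, ".", "mnist.pb")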
@@ -221,9 +221,9 @@ method ``client.set_model_from_file`` can load it into the database.
Note that TensorFlow and Keras, unlike the other ML libraries supported by
SmartSim, require an ``input`` and ``output`` argument in the call to
- ``set_model``. These arguments correspond the the layer names of the
+ ``set_model``. These arguments correspond to the layer names of the
created model. The ``smartsim.tf.freeze_model`` utility returns these
- values for convienence as shown below.
+ values for convenience as shown below.
.. code-block:: python
@@ -287,7 +287,7 @@ models to ONNX.
And PyTorch has its own converter.
Below are some examples of a few models in `Scikit-learn`_ that are converted
- into onnx format for use with SmartSim. To use onnx in SmartSim, specify
+ into ONNX format for use with SmartSim. To use ONNX in SmartSim, specify
``ONNX`` as the argument for *backend* in the call to ``client.set_model`` or
``client.set_model_from_file``.
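As a minimal sketch of that conversion step (the ``LinearRegression`` model and random training data are illustrative assumptions):

.. code-block:: python

   import numpy as np
   from skl2onnx import to_onnx
   from sklearn.linear_model import LinearRegression

   # fit an illustrative Scikit-learn model on random data
   X = np.random.rand(20, 2).astype(np.float32)
   y = np.random.rand(20).astype(np.float32)
   model = LinearRegression().fit(X, y)

   # convert to ONNX and serialize to bytes for ``client.set_model``
   model_bytes = to_onnx(model, X).SerializeToString()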
@@ -327,7 +327,7 @@ with two ``outputs``.
Random Forest
-------------
- The Random Forest example uses the Iris datset from Scikit Learn to train a
+ The Random Forest example uses the Iris dataset from Scikit-learn to train a
RandomForestRegressor. As with the other examples, the skl2onnx function
`skl2onnx.to_onnx`_ is used to convert the model to ONNX format.
@@ -348,4 +348,3 @@ RandomForestRegressor. As with the other examples, the skl2onnx function
client.set_model("rf_regressor", model, "ONNX", device="CPU")
client.run_model("rf_regressor", inputs="input", outputs="output")
print(client.get_tensor("output"))
-