add EI documentation #733

Merged · 2 commits · Mar 9, 2021
1 change: 1 addition & 0 deletions docs/mkdocs.yml
@@ -84,6 +84,7 @@ nav:
- Overview: 'mxnet/README.md'
- Import Gluon Model: 'docs/mxnet/how_to_convert_your_model_to_symbol.md'
- Load a MXNet Model: 'jupyter/load_mxnet_model.ipynb'
- Backend Optimizer for MXNet: 'docs/mxnet/mxnet_backend_optimizer.md'
- Modules:
- MXNet Engine: 'mxnet/mxnet-engine/README.md'
- MXNet Model Zoo: 'mxnet/mxnet-model-zoo/README.md'
33 changes: 33 additions & 0 deletions docs/mxnet/mxnet_backend_optimizer.md
@@ -0,0 +1,33 @@
# Custom backend optimizer support on Apache MXNet

Apache MXNet provides a mechanism that allows a third-party
backend optimizer to accelerate inference. DJL exposes this
functionality through the `MxOptimizeFor` option of the `Criteria`:

```
.optOption("MxOptimizeFor", "optimizer_name")
```

After a name is passed, DJL tries to locate the third-party library
through the `MXNET_EXTRA_LIBRARY_PATH` environment variable. You must
set this environment variable to point to the library. Once it is set, the log messages emitted during inference will indicate whether the library was loaded.
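The variable holds a comma-separated list of library files, mirroring the engine code included later in this PR. Below is a minimal, self-contained sketch of that splitting logic; the class and helper names are hypothetical, not part of DJL's public API.

```java
// Sketch of how a comma-separated MXNET_EXTRA_LIBRARY_PATH value can be
// split into individual library files. The names ExtraLibraryPaths and
// splitLibraryPaths are illustrative only.
public class ExtraLibraryPaths {

    static String[] splitLibraryPaths(String paths) {
        if (paths == null || paths.isEmpty()) {
            return new String[0]; // variable not set: no extra libraries to load
        }
        return paths.split(",");
    }

    public static void main(String[] args) {
        // In the engine this value would come from System.getenv("MXNET_EXTRA_LIBRARY_PATH")
        String paths = "/opt/ei/libeia.so,/opt/other/libfoo.so";
        for (String file : splitLibraryPaths(paths)) {
            System.out.println(file);
        }
    }
}
```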

Here is a list of supported backend optimizers:

## AWS [Elastic Inference Accelerator](https://docs.aws.amazon.com/elastic-inference/latest/developerguide/what-is-ei.html) (EIA)

Currently, you can use the EIA library with DJL on all EI-enabled instances.

You can follow these instructions to start your EI application with DJL:

https://docs.aws.amazon.com/elastic-inference/latest/developerguide/ei-mxnet.html

By default, EI logging is disabled. For debugging purposes, you can enable it by
setting the `MXNET_EXTRA_LIBRARY_VERBOSE` environment variable:

```
export MXNET_EXTRA_LIBRARY_VERBOSE=true
```
@@ -57,9 +57,10 @@ static Engine newInstance() {

// load extra MXNet library
String paths = System.getenv("MXNET_EXTRA_LIBRARY_PATH");
- boolean extraLibVerbose =
-         System.getenv().containsKey(MXNET_EXTRA_LIBRARY_VERBOSE)
-                 && System.getenv(MXNET_EXTRA_LIBRARY_VERBOSE).equals("true");
+ boolean extraLibVerbose = false;
+ if (System.getenv().containsKey(MXNET_EXTRA_LIBRARY_VERBOSE)) {
+     extraLibVerbose = Boolean.parseBoolean(System.getenv(MXNET_EXTRA_LIBRARY_VERBOSE));
+ }
if (paths != null) {
String[] files = paths.split(",");
for (String file : files) {