Need help setting up a Rasa NLU server with Docker
I went through various documentation to set up Rasa NLU on my Ubuntu server. The docs provide a Docker container which has to be run:
docker run -p 5000:5000 rasa/rasa_nlu:latest-full
So I set up a model and some training data and restarted the Docker instance. It is not able to find my model when I go to /status in the URL, and it returns project not found in the response. I believe I need to set the project path and model path when running the Docker container, but I am not sure how to do that.
I am new to Docker as well as Rasa NLU. If someone could point me in the right direction, it would be of great help!
docker machine-learning rasa-nlu rasa-core
asked Nov 20 '18 at 10:55 by Arun
1 Answer
The command you provided starts the NLU server. Since the status endpoint returns project not found, it seems that you have not yet provided a trained model.
You can either mount a directory containing the trained models as a Docker volume, e.g.:
docker run
  -v nlu-models:/app/nlu-models    # mounts `nlu-models` to `/app/nlu-models` inside the container
  -p 5000:5000                     # maps container port 5000 to port 5000 of your host
  rasa/rasa_nlu:latest-full        # the Docker image
  start --path /app/nlu-models     # starts the NLU server and points it to the directory with the trained models
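The lines above are the annotated breakdown. Note that Docker treats a bare name such as nlu-models on the left side of -v as a named volume rather than a host directory; to bind-mount a directory you need an absolute host path. Assuming your trained models live in ./nlu-models on the host (a hypothetical location), the runnable one-line form would be:
docker run -v $(pwd)/nlu-models:/app/nlu-models -p 5000:5000 rasa/rasa_nlu:latest-full start --path /app/nlu-models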
The other option is to start the server with the command from your question and then trigger training on the server by sending the training data via a POST request (make sure the header specifies Content-Type: application/x-yml). To do so, create a file config_train_server.yml which contains the configuration of your NLU pipeline together with your training data, e.g.:
language: "en"
pipeline: "spacy_sklearn"
# data contains the training examples in the same Markdown format
# as described in the training data section of the docs;
# everything indented under `data: |` is passed through verbatim
data: |
  ## intent:affirm
  - yes
  - yep

  ## intent:goodbye
  - bye
  - goodbye
Then you can send the content of the file via POST request to the server, e.g.:
curl -XPOST localhost:5000/train?project=my_project    # POST request to the training endpoint
  -H "Content-Type: application/x-yml"                  # content type header
  -d @config_train_server.yml                           # pipeline config and training data as body of the POST request
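Once training completes, the model should be listed for the project. A quick way to verify (endpoint names as in the pre-1.0 Rasa NLU HTTP API; my_project is the project name used in the training request above):
curl localhost:5000/status                              # lists available projects and loaded models
curl 'localhost:5000/parse?q=hello&project=my_project'  # parses a test message with the newly trained model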
answered Nov 20 '18 at 14:35 by Tobias