
Commit af026fb

Merge pull request #2158 from FedML-AI/raphael/refactor-quick-start
[Deploy] Refactor the quick start example, use public ip as default.
2 parents 5214078 + ea03b60

15 files changed: 42 additions & 512 deletions

Lines changed: 4 additions & 17 deletions

@@ -1,21 +1,8 @@
-workspace: "./src"
+workspace: "."
 entry_point: "main_entry.py"
+
 # If you want to install some packages
 # Please write the command in the bootstrap.sh
 bootstrap: |
-  echo "Bootstrap start..."
-  sh ./config/bootstrap.sh
-  echo "Bootstrap finished"
-
-# If you do not have any GPU resource but want to serve the model
-# Try FedML® Nexus AI Platform, and Uncomment the following lines.
-# ------------------------------------------------------------
-computing:
-  minimum_num_gpus: 1           # minimum # of GPUs to provision
-  maximum_cost_per_hour: $3000  # max cost per hour for your job per gpu card
-  #allow_cross_cloud_resources: true  # true, false
-  #device_type: CPU             # options: GPU, CPU, hybrid
-  resource_type: A100-80G       # e.g., A100-80G
-  # please check the resource type list by "fedml show-resource-type"
-  # or visiting URL: https://open.fedml.ai/accelerator_resource_type
-# ------------------------------------------------------------
+  echo "Install some packages..."
+  echo "Install finished!"
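The refactored config replaces the external `config/bootstrap.sh` with two inline echo commands under `bootstrap: |`. As an illustration only (this is an assumption about how such a block could be exercised, not FedML's actual runtime code), a deploy harness might run the inline bootstrap block as a shell script before launching the entry point:

```python
import subprocess

# Illustrative sketch: execute the YAML `bootstrap:` block as a shell
# script, the way a deploy runtime might, before starting the entry
# point. This mirrors the new inline commands from the diff above.
bootstrap_script = 'echo "Install some packages..."\necho "Install finished!"\n'

result = subprocess.run(
    ["sh", "-c", bootstrap_script],
    capture_output=True,
    text=True,
    check=True,  # raise CalledProcessError if any bootstrap command fails
)
print(result.stdout, end="")
# -> Install some packages...
#    Install finished!
```

Running the block through `sh -c` with `check=True` means a failing install command aborts the deployment early instead of serving a half-initialized model.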
Lines changed: 27 additions & 0 deletions

@@ -0,0 +1,27 @@
+from fedml.serving import FedMLPredictor
+from fedml.serving import FedMLInferenceRunner
+
+
+class Bot(FedMLPredictor):  # Inherit FedMLClientPredictor
+    def __init__(self):
+        super().__init__()
+
+        # --- Your model initialization code here ---
+
+        # -------------------------------------------
+
+    def predict(self, request: dict):
+        input_dict = request
+        question: str = input_dict.get("text", "").strip()
+
+        # --- Your model inference code here ---
+        response = "I do not know the answer to your question."
+        # ---------------------------------------
+
+        return {"generated_text": f"The answer to your question {question} is: {response}"}
+
+
+if __name__ == "__main__":
+    chatbot = Bot()
+    fedml_inference_runner = FedMLInferenceRunner(chatbot)
+    fedml_inference_runner.run()
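The new entry point defines the request/response contract of the quick-start example: `predict()` reads a `"text"` field from the request dict and returns a `"generated_text"` field. As a standalone sketch of that contract (assuming the `fedml` package is not installed here, the `FedMLPredictor` base class and `FedMLInferenceRunner` are replaced with a plain class), the same logic can be exercised without the serving runtime:

```python
# Standalone sketch of the predict() contract from the new main_entry.py.
# LocalBot is a hypothetical stand-in for the fedml-based Bot class so
# the request/response shape can be tested without the fedml package.
class LocalBot:
    def predict(self, request: dict) -> dict:
        question: str = request.get("text", "").strip()
        # Placeholder inference, matching the template's default reply.
        response = "I do not know the answer to your question."
        return {"generated_text": f"The answer to your question {question} is: {response}"}


bot = LocalBot()
reply = bot.predict({"text": "  What is FedML?  "})
print(reply["generated_text"])
# -> The answer to your question What is FedML? is: I do not know the answer to your question.
```

Note that `.strip()` makes the handler tolerant of padded input, and `.get("text", "")` keeps it from raising on a request that omits the field, so the endpoint degrades gracefully on malformed payloads.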

python/examples/deploy/quick_start/src/__init__.py

Whitespace-only changes.

python/examples/deploy/quick_start/src/app/__init__.py

Whitespace-only changes.

python/examples/deploy/quick_start/src/app/pipe/__init__.py

Whitespace-only changes.

python/examples/deploy/quick_start/src/app/pipe/constants.py

Lines changed: 0 additions & 68 deletions
This file was deleted.

python/examples/deploy/quick_start/src/app/pipe/instruct_pipeline.py

Lines changed: 0 additions & 261 deletions
This file was deleted.

0 commit comments

Comments
 (0)