Testing and Running the API | Model Deployment with FastAPI and Docker
MLOps Foundations

Testing and Running the API

Once you have containerized your FastAPI application and started the Docker container, you need to verify that the API is running correctly and returning predictions as expected. To run your Docker container, use a command like:
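A minimal sketch of that command, assuming the container listens on port 8000 (adjust the ports if your Dockerfile exposes a different one):

```shell
# Map host port 8000 to container port 8000 and start the API
docker run -p 8000:8000 your_image_name
```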

Replace your_image_name with the name of your built image. This command maps port 8000 on your local machine to port 8000 inside the container, making the FastAPI app accessible at http://localhost:8000.

You can test the /predict endpoint with command-line tools like curl or by sending an HTTP request from Python. Always make sure your input data matches the format defined by your FastAPI endpoint's request schema. For example, if your model expects a JSON payload with certain fields, your test requests should include those fields with appropriate sample values.
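For instance, a curl request matching the sample schema used in the Python example (the feature names here are placeholders, not a fixed contract) might look like this:

```shell
# Send a JSON payload to the /predict endpoint with a POST request
curl -X POST "http://localhost:8000/predict" \
  -H "Content-Type: application/json" \
  -d '{"feature1": 3.5, "feature2": 1.2, "feature3": 0.8}'
```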

import requests

# Replace with the actual URL if running on a different host or port
url = "http://localhost:8000/predict"

# Example input data matching the expected schema of your FastAPI model
input_data = {
    "feature1": 3.5,
    "feature2": 1.2,
    "feature3": 0.8
}

# A timeout prevents the request from hanging indefinitely
response = requests.post(url, json=input_data, timeout=10)

if response.status_code == 200:
    print("Prediction:", response.json())
else:
    print("Error:", response.status_code, response.text)
Note

Always validate input data and handle errors gracefully in production APIs. Never assume that clients will send well-formed or expected data. Use FastAPI's validation features and return clear error messages to help users and protect your service from unexpected input.

Which HTTP method and payload format should you use to test the /predict endpoint of your FastAPI API for making prediction requests?


Section 3. Chapter 3
