Inference

With our model trained, we can now use it for inference.

First we need to build our input data. We’ll put together 10 samples of X and Y but not provide Z.

# Imports (these may already be in scope from the training section)
import numpy as np
import pandas

# Build 10 forecast samples: X and Y are the signs (+1 or -1) of two
# independent standard-normal draws; Z is intentionally left out.
n_forecastdata = 10
xis_forecast = np.random.randn(n_forecastdata, 2)
xs_forecast = np.ones(n_forecastdata)
xs_forecast[xis_forecast[:, 0] < 0] = -1
ys_forecast = np.ones(n_forecastdata)
ys_forecast[xis_forecast[:, 1] < 0] = -1
forecast_data = pandas.DataFrame(
    data=(np.vstack([xs_forecast, ys_forecast])).T,
    index=range(n_forecastdata),
    columns=["X", "Y"]
)
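
A quick look at the dataframe we just built (your values will differ from run to run, since the inputs are random):

print(forecast_data)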

Now we will construct the inference parameters that determine how the model is run on the forecast data. Here we have to specify state parameters; you can read more about them in the State Parameters section.

from qcog_python_client.schema.parameters import LOBPCGFastStateParameters

parameters = LOBPCGFastStateParameters(
    iterations=5
)

Finally we execute an inference call against our trained model, providing the forecast data and the parameters.

predicted_df = model.inference(
    forecast_data,
    parameters={
        "state_parameters": parameters
    }
)

You can print the returned dataframe and see how close the predictions come to what we expect.
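
For example, a minimal check (the exact name of the predicted column depends on the targets used when the model was trained):

# Compare the inputs with the model's predictions
print(forecast_data)
print(predicted_df)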

Loading a pre-trained model

If you have trained a model in a different script, session, or process, you can load it back into the qcml object and use it for inference, as discussed in the training section.

qcml = qcml.preloaded_model(model_id)
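
Once loaded, inference works just as shown above. As a minimal sketch (assuming forecast_data and parameters are defined as before, and that we call inference directly on the loaded qcml object):

predicted_df = qcml.inference(
    forecast_data,
    parameters={
        "state_parameters": parameters
    }
)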

Using the async client

The async client has the same interface, except that we have to await the inference call.

result_df = await model.inference(
    forecast_data,
    parameters={
        "state_parameters": parameters
    }
)
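
If you are not already inside a coroutine, you can wrap the call in an async function and run it with asyncio. The sketch below assumes model, forecast_data, and parameters are defined as in the sections above:

import asyncio

async def run_inference():
    # Await the inference call on the async model
    result_df = await model.inference(
        forecast_data,
        parameters={
            "state_parameters": parameters
        }
    )
    print(result_df)

asyncio.run(run_inference())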

Next Steps

That’s it! You have all the pieces to get started with the QCML library. You can find some more complex examples in our examples section on the left. There are also a lot of options and parameters for you to explore to find the best fit for your problem space.

Try some more complicated examples and see how the model performs, or dive right into your own data.