Lomas Client Side: Using Smartnoise-Synth
This notebook showcases how a researcher can use the Secure Data Disclosure system. It explains the different functionalities provided by the lomas-client
client library to interact with the secure server.
The secure data are never visible to researchers. They can only access differentially private responses via queries to the server.
Each user has access to one or multiple projects and, for each dataset, has a limited budget defined by \(\epsilon\) and \(\delta\) values.
Step 1: Install the library
To interact with the secure server on which the data is stored, Dr. Antartica first needs to install the lomas-client library
in her local development environment.
It can be installed via pip:
[1]:
# !pip install lomas_client
Or using a local version of the client:
[2]:
import sys
import os
sys.path.append(os.path.abspath(os.path.join('..')))
[3]:
from lomas_client import Client
import numpy as np
Step 2: Initialise the client
Once the library is installed, a Client object must be created. It is responsible for sending requests to the server and processing the responses in the local environment. It enables seamless interaction with the server.
To create the client, Dr. Antartica needs to provide a few parameters:
- url: the root application endpoint of the remote secure server.
- user_name: her name as registered in the database (Dr. Alice Antartica).
- dataset_name: the name of the dataset she wants to query (PENGUIN).
She will only be able to query the real dataset if the queen Icergina has previously created an account for her in the database, given her access to the PENGUIN dataset and granted her some epsilon and delta credit (as is done in the Admin Notebook for Users and Datasets management).
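The budget mechanics can be pictured with a small sketch. This Budget class is purely illustrative (the real accounting happens server-side in lomas); it only shows how the granted epsilon and delta credit is consumed by queries on real data:

```python
# Illustrative only: lomas tracks the budget server-side. This sketch just
# shows how epsilon/delta credit shrinks with each query on real data.
class Budget:
    def __init__(self, epsilon: float, delta: float):
        self.epsilon = epsilon
        self.delta = delta

    def spend(self, epsilon: float, delta: float) -> None:
        """Deduct a query's cost, refusing queries that exceed the credit."""
        if epsilon > self.epsilon or delta > self.delta:
            raise ValueError("query would exceed the remaining budget")
        self.epsilon -= epsilon
        self.delta -= delta

budget = Budget(epsilon=10.0, delta=0.005)   # illustrative credit values
budget.spend(1.0, 0.0001)                    # e.g. one synthesizer query
```

Once the credit is exhausted, no further queries on the real data are possible for this dataset.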
[4]:
APP_URL = "http://lomas_server"
USER_NAME = "Dr. Antartica"
DATASET_NAME = "PENGUIN"
client = Client(url=APP_URL, user_name = USER_NAME, dataset_name = DATASET_NAME)
And that’s it for the preparation. She is now ready to use the various functionalities offered by lomas-client.
Step 3: Metadata and dummy dataset
Getting dataset metadata
Dr. Antartica has never seen the data. As a first step towards understanding what is available to her, she would like to check the metadata of the dataset. For this, she just needs to call the get_dataset_metadata()
function of the client. As this is public information, it does not cost any budget.
This function returns metadata in a format based on the Smartnoise-SQL dictionary format, which contains, among other things, information about all the available columns, their type and their bound values (see the Smartnoise page for more details). Any metadata required by Smartnoise-SQL is also required here, and additional information, such as the different categories of a string-type column, can be added.
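Such a metadata dictionary can also be consumed programmatically, for instance to collect the bounds of the numeric columns and the names of the categorical ones. A minimal sketch, using a hand-written two-column subset shaped like the server's response:

```python
# Hand-made subset of the metadata, shaped like the server response
# (only the fields used here are included).
metadata = {
    "columns": {
        "species": {"type": "string", "cardinality": 3,
                    "categories": ["Adelie", "Chinstrap", "Gentoo"]},
        "bill_length_mm": {"type": "float", "lower": 30.0, "upper": 65.0},
    }
}

# Bounds of the float columns, e.g. for clamping or plotting ranges.
bounds = {name: (col["lower"], col["upper"])
          for name, col in metadata["columns"].items()
          if col["type"] == "float"}

# Names of the categorical (string) columns.
categorical = [name for name, col in metadata["columns"].items()
               if col["type"] == "string"]
```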
[5]:
penguin_metadata = client.get_dataset_metadata()
penguin_metadata
[5]:
{'max_ids': 1,
'rows': 344,
'row_privacy': True,
'censor_dims': False,
'columns': {'species': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'string',
'cardinality': 3,
'categories': ['Adelie', 'Chinstrap', 'Gentoo']},
'island': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'string',
'cardinality': 3,
'categories': ['Torgersen', 'Biscoe', 'Dream']},
'bill_length_mm': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'float',
'precision': 64,
'lower': 30.0,
'upper': 65.0},
'bill_depth_mm': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'float',
'precision': 64,
'lower': 13.0,
'upper': 23.0},
'flipper_length_mm': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'float',
'precision': 64,
'lower': 150.0,
'upper': 250.0},
'body_mass_g': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'float',
'precision': 64,
'lower': 2000.0,
'upper': 7000.0},
'sex': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'string',
'cardinality': 2,
'categories': ['MALE', 'FEMALE']}}}
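Because the metadata is public, it is also what makes dummy datasets possible. The sketch below shows one way dummy rows could be derived from it — categorical columns sampled from their categories, floats drawn uniformly between the bounds. This is purely illustrative; the server's actual dummy generator may differ:

```python
import random

# Illustrative only: how a dummy dataset could be derived from public
# metadata. The server's actual generator may differ.
random.seed(0)

columns = {
    "species": {"type": "string",
                "categories": ["Adelie", "Chinstrap", "Gentoo"]},
    "bill_depth_mm": {"type": "float", "lower": 13.0, "upper": 23.0},
}

def dummy_row(cols):
    row = {}
    for name, col in cols.items():
        if col["type"] == "string":
            row[name] = random.choice(col["categories"])   # pick a category
        else:
            row[name] = random.uniform(col["lower"], col["upper"])  # in bounds
    return row

dummy = [dummy_row(columns) for _ in range(100)]
```

Since dummy data carries no private information, experimenting on it costs no budget.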
Step 4: Create a Synthetic Dataset keeping all default parameters
We want to train a synthetic-data model to represent the private data.
Therefore, we use the Smartnoise-Synth synthesizers.
Let’s list the available options. Their respective parameters are detailed in the Smartnoise-Synth documentation here.
[6]:
from snsynth import Synthesizer
Synthesizer.list_synthesizers()
[6]:
['mwem', 'dpctgan', 'patectgan', 'mst', 'pacsynth', 'dpgan', 'pategan', 'aim']
AIM: Adaptive Iterative Mechanism
We start by executing a query on the dummy dataset without specifying any special parameters for AIM (all optional parameters kept at their defaults). AIM only works on categorical columns, so we select the “species” and “island” columns to create a synthetic dataset of these two columns.
[7]:
res_dummy = client.smartnoise_synth.query(
synth_name="aim",
epsilon=1.0,
delta=0.0001,
select_cols = ["species", "island"],
dummy=True,
)
res_dummy.result.df_samples
[7]:
species | island | |
---|---|---|
0 | Gentoo | Biscoe |
1 | Adelie | Biscoe |
2 | Gentoo | Dream |
3 | Chinstrap | Dream |
4 | Gentoo | Biscoe |
... | ... | ... |
195 | Gentoo | Torgersen |
196 | Chinstrap | |
197 | Chinstrap | Torgersen |
198 | Adelie | Biscoe |
199 | Gentoo | Dream |
200 rows × 2 columns
The algorithm works and returns a synthetic dataset. We now estimate the cost of running this command:
[8]:
res_cost = client.smartnoise_synth.cost(
synth_name="aim",
epsilon=1.0,
delta=0.0001,
select_cols = ["species", "island"],
)
res_cost
[8]:
CostResponse(epsilon=1.0, delta=0.0001)
Executing such a query on the private dataset would cost 1.0 epsilon and 0.0001 delta. Dr. Antartica first runs it again on the dummy, this time specifying that she wants the aim synthesizer model in return (with return_model = True).
NOTE: if she does not set the parameter return_model = True, it is False by default and she will directly get a synthetic dataframe as response.
[9]:
res = client.smartnoise_synth.query(
synth_name="aim",
epsilon=1.0,
delta=0.0001,
select_cols = ["species", "island"],
dummy=True,
return_model = True
)
res.result.model
/usr/local/lib/python3.12/site-packages/mbi/__init__.py:15: UserWarning: MixtureInference disabled, please install jax and jaxlib
warnings.warn('MixtureInference disabled, please install jax and jaxlib')
[9]:
<snsynth.aim.aim.AIMSynthesizer at 0x71b3586e6a80>
She can now retrieve the model and sample from it. She chooses to draw 10 samples.
[10]:
synth = res.result.model
synth.sample(10)
[10]:
species | island | |
---|---|---|
0 | Gentoo | Torgersen |
1 | Gentoo | Biscoe |
2 | Chinstrap | Biscoe |
3 | Gentoo | Dream |
4 | Adelie | Dream |
5 | Chinstrap | Biscoe |
6 | Gentoo | Dream |
7 | Adelie | Torgersen |
8 | Chinstrap | Dream |
9 | Gentoo | Torgersen |
She now wants to pass some specific parameters to the AIM model. For this, she needs to set them in synth_params
based on the Smartnoise-Synth documentation here. She decides to reduce max_model_size
to 50 (the default is 80) and tries it on the dummy.
[11]:
res_dummy = client.smartnoise_synth.query(
synth_name="aim",
epsilon=1.0,
delta=0.0001,
select_cols = ["species", "island"],
dummy=True,
return_model = True,
synth_params = {"max_model_size": 50}
)
res_dummy.result.model
[11]:
<snsynth.aim.aim.AIMSynthesizer at 0x71b321d293d0>
[12]:
synth = res_dummy.result.model
synth.sample(5)
[12]:
species | island | |
---|---|---|
0 | Gentoo | Torgersen |
1 | Chinstrap | Dream |
2 | Gentoo | Biscoe |
3 | Chinstrap | Biscoe |
4 | Adelie | Dream |
Now that the workflow is understood for AIM, she wants to experiment with various synthesizers on the dummy.
MWEM: Multiplicative Weights Exponential Mechanism
She tries MWEM on all columns with all default parameters. As return_model
is not specified, she will directly receive a synthetic dataframe back.
[13]:
res_dummy = client.smartnoise_synth.query(
synth_name="mwem",
epsilon=1.0,
dummy=True,
)
res_dummy.result.df_samples.head()
[13]:
species | island | bill_length_mm | bill_depth_mm | flipper_length_mm | body_mass_g | sex | |
---|---|---|---|---|---|---|---|
0 | Adelie | Dream | 56.25 | 22.5 | 165.0 | 4750.0 | FEMALE |
1 | Gentoo | Dream | 35.25 | 18.5 | 205.0 | 2750.0 | FEMALE |
2 | Chinstrap | Biscoe | 49.25 | 13.5 | 215.0 | 2250.0 | FEMALE |
3 | Adelie | Biscoe | 35.25 | 13.5 | 225.0 | 4750.0 | MALE |
4 | Adelie | Dream | 56.25 | 22.5 | 165.0 | 4750.0 | FEMALE |
She now specifies 3 columns and some parameters explained here.
[14]:
res_dummy = client.smartnoise_synth.query(
synth_name="mwem",
epsilon=1.0,
select_cols = ["species", "island", "sex"],
synth_params = {"measure_only": False, "max_retries_exp_mechanism": 5},
dummy=True,
)
res_dummy.result.df_samples.head()
[14]:
species | island | sex | |
---|---|---|---|
0 | Chinstrap | Torgersen | FEMALE |
1 | Adelie | Torgersen | FEMALE |
2 | Adelie | Torgersen | FEMALE |
3 | Chinstrap | Torgersen | FEMALE |
4 | Gentoo | Biscoe | MALE |
Finally, for MWEM, she wants to go more in depth and create her own data preparation pipeline. For this, she can use the Smartnoise-Synth “Data Transformers” explained here and send her own constraints dictionary for specific steps. This is intended for advanced users.
By default, if no constraints are specified, the server automatically creates a data transformer based on the selected columns, the synthesizer and the metadata.
Here she wants to add a clamping transformation on the continuous columns before training the synthesizer. She adds the bounds based on the metadata.
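To see what such a clamp-then-bin chain does to individual values, here is a pure-Python mirror of the two steps on one column. Uniform bin edges are an assumption here; snsynth's BinTransformer may place edges differently:

```python
# Pure-Python mirror of a clamp-then-bin chain on one column.
# Uniform bin edges are an assumption; snsynth may differ in detail.
def clamp(x, lower, upper):
    """Force x into [lower, upper]."""
    return max(lower, min(upper, x))

def bin_index(x, lower, upper, bins):
    """Map a clamped value to one of `bins` uniform bins."""
    x = clamp(x, lower, upper)
    width = (upper - lower) / bins
    return min(int((x - lower) / width), bins - 1)  # top edge -> last bin

# Bounds from the metadata (30.0 / 65.0), tightened by 10 as in the
# constraints below.
lower, upper, bins = 30.0 + 10, 65.0 - 10, 20

clamped = clamp(62.3, lower, upper)        # out-of-range value pulled to 55.0
idx = bin_index(47.5, lower, upper, bins)  # mid-range value -> bin 10
```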
[15]:
bl_bounds = penguin_metadata["columns"]["bill_length_mm"]
bd_bounds = penguin_metadata["columns"]["bill_depth_mm"]
bl_bounds, bd_bounds
[15]:
({'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'float',
'precision': 64,
'lower': 30.0,
'upper': 65.0},
{'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'float',
'precision': 64,
'lower': 13.0,
'upper': 23.0})
[16]:
from snsynth.transform import BinTransformer, ClampTransformer, ChainTransformer, LabelTransformer
my_own_constraints = {
"bill_length_mm": ChainTransformer(
[
ClampTransformer(lower = bl_bounds["lower"] + 10, upper = bl_bounds["upper"] - 10),
BinTransformer(bins = 20, lower = bl_bounds["lower"] + 10, upper = bl_bounds["upper"] - 10),
]
),
"bill_depth_mm": ChainTransformer(
[
ClampTransformer(lower = bd_bounds["lower"] + 2, upper = bd_bounds["upper"] - 2),
BinTransformer(bins=20, lower = bd_bounds["lower"] + 2, upper = bd_bounds["upper"] - 2),
]
),
"species": LabelTransformer(nullable=True)
}
[17]:
res_dummy = client.smartnoise_synth.query(
synth_name="mwem",
epsilon=1.0,
select_cols = ["bill_length_mm", "bill_depth_mm", "species"],
constraints = my_own_constraints,
dummy=True,
)
res_dummy.result.df_samples.head()
[17]:
bill_length_mm | bill_depth_mm | species | |
---|---|---|---|
0 | 47.875 | 15.15 | Chinstrap |
1 | 48.625 | 15.45 | Gentoo |
2 | 55.000 | 20.85 | Adelie |
3 | 47.875 | 15.15 | Chinstrap |
4 | 47.875 | 15.15 | Chinstrap |
A subset of constraints can also be specified for certain columns, and the server will automatically generate those for the missing columns.
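The merging idea can be sketched as follows: user-supplied constraints take precedence, and defaults are filled in for the remaining selected columns. The default_transformer_for helper and the string stand-ins are hypothetical, not the server's actual implementation:

```python
# Sketch of the server-side idea: user constraints win, defaults fill the
# gaps. `default_transformer_for` is a hypothetical stand-in for the
# server's automatic transformer construction.
def build_constraints(select_cols, user_constraints, default_transformer_for):
    return {col: user_constraints.get(col, default_transformer_for(col))
            for col in select_cols}

built = build_constraints(
    ["bill_length_mm", "bill_depth_mm", "species"],
    {"bill_length_mm": "user-chain"},   # stands in for a ChainTransformer
    lambda col: f"auto-{col}",          # stands in for a generated default
)
```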
[18]:
my_own_constraints = {
"bill_length_mm": ChainTransformer(
[
ClampTransformer(lower = bl_bounds["lower"] + 10, upper = bl_bounds["upper"] - 10),
BinTransformer(bins = 20, lower = bl_bounds["lower"] + 10, upper = bl_bounds["upper"] - 10),
]
)
}
In this case, only bill_length_mm will be clamped.
[19]:
res_dummy = client.smartnoise_synth.query(
synth_name="mwem",
epsilon=1.0,
select_cols = ["bill_length_mm", "bill_depth_mm", "species"],
constraints = my_own_constraints,
dummy=True,
)
res_dummy.result.df_samples.head()
[19]:
bill_length_mm | bill_depth_mm | species | |
---|---|---|---|
0 | 54.625 | 14.5 | Adelie |
1 | 40.375 | 13.5 | Gentoo |
2 | 54.625 | 14.5 | Adelie |
3 | 54.625 | 14.5 | Adelie |
4 | 54.625 | 14.5 | Adelie |
MST: Maximum Spanning Tree
She now experiments with MST. As this synthesizer is very demanding in terms of computation, she selects a subset of columns for it. See MST here.
[20]:
res_dummy = client.smartnoise_synth.query(
synth_name="mst",
epsilon=1.0,
select_cols = ["species", "sex"],
dummy=True,
)
res_dummy.result.df_samples.head()
[20]:
species | sex | |
---|---|---|
0 | Chinstrap | FEMALE |
1 | ||
2 | Chinstrap | |
3 | ||
4 | Gentoo | MALE |
She can also specify the number of samples to return (if return_model is not True):
[21]:
res_dummy = client.smartnoise_synth.query(
synth_name="mst",
epsilon=1.0,
select_cols = ["species", "sex"],
nb_samples = 4,
dummy=True,
)
res_dummy.result.df_samples
[21]:
species | sex | |
---|---|---|
0 | FEMALE | |
1 | Gentoo | MALE |
2 | Gentoo | FEMALE |
3 | Chinstrap | MALE |
She can also add a condition on these samples. For instance, here she only wants female samples.
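The semantics of the condition parameter can be pictured as rejection sampling: draw synthetic rows and keep only those matching the condition until enough are collected. The server applies the condition for you; sample_row below is a stand-in generator, not the MST model:

```python
import random

# Illustrative equivalent of `condition = "sex = FEMALE"`. The server does
# this for you; `sample_row` is a stand-in, not the trained synthesizer.
random.seed(1)

def sample_row():
    return {"sex": random.choice(["MALE", "FEMALE"]),
            "species": random.choice(["Adelie", "Chinstrap", "Gentoo"])}

def sample_where(predicate, nb_samples):
    """Draw rows until nb_samples of them satisfy the predicate."""
    rows = []
    while len(rows) < nb_samples:
        row = sample_row()
        if predicate(row):
            rows.append(row)
    return rows

females = sample_where(lambda r: r["sex"] == "FEMALE", nb_samples=4)
```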
[22]:
res_dummy = client.smartnoise_synth.query(
synth_name="mst",
epsilon=1.0,
select_cols = ["sex", "species"],
nb_samples = 4,
condition = "sex = FEMALE",
dummy=True,
)
res_dummy.result.df_samples
[22]:
sex | species | |
---|---|---|
0 | Gentoo | |
1 | Gentoo | |
2 | Gentoo | |
3 | Gentoo |
DPCTGAN: Differentially Private Conditional Tabular GAN
She now tries DPCTGAN. A first warning lets her know that the random noise generation for this model is not cryptographically secure; if this is not acceptable to her, she can decide to stop using this synthesizer. She then does not get a response but a 422 error with an explanation.
[23]:
res_dummy = client.smartnoise_synth.query(
synth_name="dpctgan",
epsilon=1.0,
dummy=True,
)
res_dummy
/code/lomas_client/utils.py:62: UserWarning: Warning:dpctgan synthesizer random generator for noise and shuffling is not cryptographically secure. (pseudo-rng in vanilla PyTorch).
warnings.warn(
---------------------------------------------------------------------------
ExternalLibraryException Traceback (most recent call last)
Cell In[23], line 1
----> 1 res_dummy = client.smartnoise_synth.query(
2 synth_name="dpctgan",
3 epsilon=1.0,
4 dummy=True,
5 )
6 res_dummy
File /code/lomas_client/libraries/smartnoise_synth.py:203, in SmartnoiseSynthClient.query(self, synth_name, epsilon, delta, select_cols, synth_params, nullable, constraints, dummy, return_model, condition, nb_samples, nb_rows, seed)
200 r_model = QueryResponse.model_validate_json(data)
201 return r_model
--> 203 raise_error(res)
204 return None
File /code/lomas_client/utils.py:38, in raise_error(response)
36 raise InvalidQueryException(error_message["InvalidQueryException"])
37 if response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY:
---> 38 raise ExternalLibraryException(
39 error_message["library"], error_message["ExternalLibraryException"]
40 )
41 if response.status_code == status.HTTP_403_FORBIDDEN:
42 raise UnauthorizedAccessException(error_message["UnauthorizedAccessException"])
ExternalLibraryException: ('smartnoise_synth', 'Error fitting model: sample_rate=5.0 is not a valid value. Please provide a float between 0 and 1. Try decreasing batch_size in synth_params (default batch_size=500).')
The default parameters of DPCTGAN do not work for the PENGUIN dataset. Hence, as advised in the error message, she decreases the batch_size (she also checks the documentation here).
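The error likely comes from the DP-SGD sampling rate used under the hood, which is roughly batch_size divided by the number of training rows and must lie in (0, 1]. A quick sanity check of that arithmetic, with an illustrative n_rows (not necessarily the server's actual dummy size):

```python
# Hedged sketch: in DP-SGD, sample_rate ~= batch_size / n_rows and must be
# in (0, 1]. n_rows = 100 is illustrative only.
def valid_batch_size(batch_size: int, n_rows: int) -> bool:
    """True if the implied sampling rate is a valid probability."""
    return 0 < batch_size / n_rows <= 1

n_rows = 100
default_ok = valid_batch_size(500, n_rows)   # sample_rate = 5.0 -> rejected
smaller_ok = valid_batch_size(50, n_rows)    # sample_rate = 0.5 -> accepted
```

This is why lowering batch_size below the dataset size, as the error message suggests, resolves the failure.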
[24]:
res_dummy = client.smartnoise_synth.query(
synth_name="dpctgan",
epsilon=1.0,
synth_params = {"batch_size": 50},
dummy=True,
)
res_dummy.result.df_samples.head()
[24]:
species | island | bill_length_mm | bill_depth_mm | flipper_length_mm | body_mass_g | sex | |
---|---|---|---|---|---|---|---|
0 | Adelie | Dream | 43.414108 | 17.841402 | 180.284642 | 5016.072102 | FEMALE |
1 | Gentoo | Biscoe | 43.298852 | 16.777365 | 222.225340 | 5162.192479 | MALE |
2 | Adelie | Dream | 50.622394 | 19.280649 | 209.893867 | 5275.184557 | FEMALE |
3 | Chinstrap | Biscoe | 41.493216 | 17.206660 | 233.323157 | 2938.070863 | FEMALE |
4 | Adelie | Biscoe | 46.749278 | 17.139504 | 204.060608 | 5795.609772 | MALE |
PATEGAN: Private Aggregation of Teacher Ensembles
Unfortunately, she is not able to train the pategan synthesizer on the PENGUIN dataset. Hence, she must try another one.
[25]:
res_dummy = client.smartnoise_synth.query(
synth_name="pategan",
epsilon=1.0,
dummy=True,
)
res_dummy
---------------------------------------------------------------------------
ExternalLibraryException Traceback (most recent call last)
Cell In[25], line 1
----> 1 res_dummy = client.smartnoise_synth.query(
2 synth_name="pategan",
3 epsilon=1.0,
4 dummy=True,
5 )
6 res_dummy
File /code/lomas_client/libraries/smartnoise_synth.py:203, in SmartnoiseSynthClient.query(self, synth_name, epsilon, delta, select_cols, synth_params, nullable, constraints, dummy, return_model, condition, nb_samples, nb_rows, seed)
200 r_model = QueryResponse.model_validate_json(data)
201 return r_model
--> 203 raise_error(res)
204 return None
File /code/lomas_client/utils.py:38, in raise_error(response)
36 raise InvalidQueryException(error_message["InvalidQueryException"])
37 if response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY:
---> 38 raise ExternalLibraryException(
39 error_message["library"], error_message["ExternalLibraryException"]
40 )
41 if response.status_code == status.HTTP_403_FORBIDDEN:
42 raise UnauthorizedAccessException(error_message["UnauthorizedAccessException"])
ExternalLibraryException: ('smartnoise_synth', 'pategan not reliable with this dataset.')
PATECTGAN: Conditional tabular GAN using Private Aggregation of Teacher Ensembles
[26]:
res_dummy = client.smartnoise_synth.query(
synth_name="patectgan",
epsilon=1.0,
dummy=True,
)
res_dummy.result.df_samples.head()
[26]:
species | island | bill_length_mm | bill_depth_mm | flipper_length_mm | body_mass_g | sex | |
---|---|---|---|---|---|---|---|
0 | Adelie | Torgersen | 44.965223 | 21.050138 | 197.011286 | 3798.269078 | MALE |
1 | Chinstrap | Biscoe | 54.784711 | 18.795483 | 189.339603 | 4936.383002 | MALE |
2 | Chinstrap | Biscoe | 58.836415 | 14.854715 | 201.541473 | 4849.516831 | MALE |
3 | Gentoo | Dream | 49.260641 | 19.661433 | 245.845395 | 4142.061740 | FEMALE |
4 | Gentoo | Torgersen | 48.662708 | 17.788002 | 177.374248 | 5917.481452 | FEMALE |
[27]:
res_dummy = client.smartnoise_synth.query(
synth_name="patectgan",
epsilon=1.0,
select_cols = ["island", "bill_length_mm", "body_mass_g"],
synth_params = {
"embedding_dim": 256,
"generator_dim": (128, 128),
"discriminator_dim": (256, 256),
"generator_lr": 0.0003,
"generator_decay": 1e-05,
"discriminator_lr": 0.0003,
"discriminator_decay": 1e-05,
"batch_size": 500
},
nb_samples = 100,
dummy=True,
)
res_dummy.result.df_samples.head()
[27]:
island | bill_length_mm | body_mass_g | |
---|---|---|---|
0 | Dream | 51.295550 | 4649.196619 |
1 | Biscoe | 38.369172 | 4301.166393 |
2 | Biscoe | 52.136779 | 4498.011571 |
3 | Torgersen | 58.900825 | 4223.040946 |
4 | Dream | 40.492166 | 3707.417592 |
DPGAN: Differentially Private GAN
For DPGAN, there is the same warning as for DPCTGAN about the random noise generation not being cryptographically secure.
[28]:
res_dummy = client.smartnoise_synth.query(
synth_name="dpgan",
epsilon=1.0,
dummy=True,
)
res_dummy.result.df_samples.head()
/code/lomas_client/utils.py:62: UserWarning: Warning:dpgan synthesizer random generator for noise and shuffling is not cryptographically secure. (pseudo-rng in vanilla PyTorch).
warnings.warn(
[28]:
species | island | bill_length_mm | bill_depth_mm | flipper_length_mm | body_mass_g | sex | |
---|---|---|---|---|---|---|---|
0 | Gentoo | Biscoe | 61.084300 | 17.778250 | 202.404261 | 4074.338235 | MALE |
1 | Adelie | Dream | 45.143127 | 23.000000 | 250.000000 | 4078.621872 | MALE |
2 | Gentoo | Biscoe | 63.310050 | 16.944589 | 215.155567 | 3999.723613 | FEMALE |
3 | Gentoo | Dream | 65.000000 | 22.198413 | 218.926238 | 7000.000000 | MALE |
4 | Adelie | Dream | 65.000000 | 23.000000 | 191.299780 | 4249.239404 | MALE |
One final time, she samples with a condition:
[29]:
res_dummy = client.smartnoise_synth.query(
synth_name="dpgan",
epsilon=1.0,
condition = "body_mass_g > 5000",
dummy=True,
)
res_dummy.result.df_samples.head()
/code/lomas_client/utils.py:62: UserWarning: Warning:dpgan synthesizer random generator for noise and shuffling is not cryptographically secure. (pseudo-rng in vanilla PyTorch).
warnings.warn(
[29]:
species | island | bill_length_mm | bill_depth_mm | flipper_length_mm | body_mass_g | sex | |
---|---|---|---|---|---|---|---|
0 | Gentoo | Torgersen | 48.614252 | 17.252423 | 250.000000 | 7000.000000 | FEMALE |
1 | Gentoo | Torgersen | 62.443527 | 17.991540 | 250.000000 | 7000.000000 | FEMALE |
2 | Chinstrap | Dream | 65.000000 | 23.000000 | 226.908019 | 7000.000000 | MALE |
3 | Gentoo | Dream | 60.141646 | 16.770572 | 246.724272 | 5726.566434 | MALE |
4 | Adelie | Torgersen | 46.260255 | 16.974378 | 250.000000 | 6849.641472 | MALE |
And now she runs it on the real dataset:
[30]:
res_dummy = client.smartnoise_synth.query(
synth_name="dpgan",
epsilon=1.0,
condition = "body_mass_g > 5000",
nb_samples = 10,
dummy=False,
)
res_dummy.result.df_samples
/code/lomas_client/utils.py:62: UserWarning: Warning:dpgan synthesizer random generator for noise and shuffling is not cryptographically secure. (pseudo-rng in vanilla PyTorch).
warnings.warn(
[30]:
species | island | bill_length_mm | bill_depth_mm | flipper_length_mm | body_mass_g | sex | |
---|---|---|---|---|---|---|---|
0 | Adelie | Biscoe | 44.275917 | 23.000000 | 194.386986 | 5710.911572 | |
1 | Chinstrap | Biscoe | 45.761536 | 19.180464 | 190.228606 | 5585.222661 | FEMALE |
2 | Chinstrap | Biscoe | 51.918343 | 20.711846 | 250.000000 | 5547.108099 | |
3 | Adelie | Dream | 65.000000 | 23.000000 | 193.761142 | 7000.000000 | |
4 | Chinstrap | Dream | 65.000000 | 23.000000 | 244.220206 | 6518.389255 | MALE |
5 | Adelie | Torgersen | 61.533132 | 20.927101 | 186.077987 | 5242.271543 | MALE |
6 | Chinstrap | Dream | 46.066000 | 20.364783 | 198.249876 | 5248.364478 | |
7 | Adelie | Torgersen | 63.791512 | 17.969750 | 199.137564 | 7000.000000 | |
8 | Adelie | Dream | 65.000000 | 16.838814 | 180.955905 | 6145.199358 | |
9 | Adelie | Biscoe | 65.000000 | 19.727768 | 183.611000 | 7000.000000 |
Step 5: See archives of queries
She now wants to review all the queries she made on the real data. This is possible because an archive of all queries is kept in a secure database. With a function call, she can see her queries, the spent budget and the associated responses.
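Since each archive entry records the epsilon and delta spent by its query, the total consumed budget can be tallied from the archive. A sketch over hand-made entries shaped like the response shown below (only the fields needed here are included):

```python
# Hand-made archive entries shaped like the server's archive response.
previous = [
    {"dp_librairy": "smartnoise_synth",
     "response": {"epsilon": 1.0, "delta": 0.0001}},
    {"dp_librairy": "smartnoise_synth",
     "response": {"epsilon": 1.0, "delta": 0.00015673368198174188}},
]

# Total budget consumed across all archived queries.
total_epsilon = sum(q["response"]["epsilon"] for q in previous)
total_delta = sum(q["response"]["delta"] for q in previous)
```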
[31]:
previous_queries = client.get_previous_queries()
Let’s check the last query:
[32]:
last_query = previous_queries[-1]
last_query
[32]:
{'user_name': 'Dr. Antartica',
'dataset_name': 'PENGUIN',
'dp_librairy': 'smartnoise_synth',
'client_input': {'dataset_name': 'PENGUIN',
'synth_name': 'dpgan',
'epsilon': 1.0,
'delta': None,
'select_cols': [],
'synth_params': {},
'nullable': True,
'constraints': '',
'return_model': False,
'condition': 'body_mass_g > 5000',
'nb_samples': 10},
'response': {'epsilon': 1.0,
'delta': 0.00015673368198174188,
'requested_by': 'Dr. Antartica',
'result': res_type \
index sn_synth_samples
columns sn_synth_samples
data sn_synth_samples
index_names sn_synth_samples
column_names sn_synth_samples
df_samples
index [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
columns [species, island, bill_length_mm, bill_depth_m...
data [[Adelie, Biscoe, 44.27591737359762, 23.0, 194...
index_names [None]
column_names [None] },
'timestamp': 1728461702.0089684}