S3 example
Step 1: Install the library
To interact with the secure server on which the data is stored, one first needs to install the lomas-client library
in their local development environment.
It can be installed via pip:
[1]:
!pip install lomas-client
Requirement already satisfied: lomas-client in /usr/local/lib/python3.12/site-packages (0.3.3)
(… remaining dependency-resolution output truncated …)
[1]:
from lomas_client.client import Client
import numpy as np
Step 2: Initialise the client
Once the library is installed, a Client object must be created. It is responsible for sending requests to the server and processing the responses in the local environment, enabling seamless interaction with the server.
To create the client, one needs to give it a few parameters:
- url: the root application endpoint of the remote secure server
- user_name: the user's name as registered in the database (Jack)
- dataset_name: the name of the dataset to query (TITANIC)
[2]:
APP_URL = "http://lomas_server"
USER_NAME = "Jack"
DATASET_NAME = "TITANIC"
client = Client(url=APP_URL, user_name=USER_NAME, dataset_name=DATASET_NAME)
Step 3: Understand the functionalities of the library
Getting dataset metadata
[3]:
titanic_metadata = client.get_dataset_metadata()
titanic_metadata
[3]:
{'max_ids': 1,
'rows': 887,
'row_privacy': True,
'censor_dims': False,
'columns': {'Pclass': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'int',
'precision': 32,
'lower': 1,
'upper': 3},
'Name': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'string'},
'Sex': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'string',
'cardinality': 2,
'categories': ['male', 'female']},
'Age': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'float',
'precision': 64,
'lower': 0.1,
'upper': 100.0},
'SibSp': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'int',
'precision': 32,
'lower': 0,
'upper': 10},
'Parch': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'int',
'precision': 32,
'lower': 0,
'upper': 10},
'Ticket': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'string'},
'Fare': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'float',
'precision': 64,
'lower': 0.0,
'upper': 1000.0},
'Cabin': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'string'},
'Embarked': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'string',
'cardinality': 3,
'categories': ['C', 'Q', 'S']},
'Survived': {'private_id': False,
'nullable': False,
'max_partition_length': None,
'max_influenced_partitions': None,
'max_partition_contributions': None,
'type': 'boolean'}}}
Get a dummy dataset
[4]:
NB_ROWS = 200
SEED = 0
[5]:
df_dummy = client.get_dummy_dataset(
nb_rows = NB_ROWS,
seed = SEED
)
print(df_dummy.shape)
df_dummy.head()
(200, 11)
[5]:
|   | Pclass | Name | Sex | Age | SibSp | Parch | Ticket | Fare | Cabin | Embarked | Survived |
|---|--------|------|-----|-----|-------|-------|--------|------|-------|----------|----------|
| 0 | 3 | o | female | 89.690443 | 6 | 6 | 2 | 858.435326 | U | S | True |
| 1 | 2 | D | male | 58.373673 | 0 | 0 | Z | 620.908898 | a | C | True |
| 2 | 2 | u | female | 4.117800 | 2 | 4 | h | 193.917948 | G | S | True |
| 3 | 1 | o | male | 71.177534 | 9 | 7 | a | 687.914521 | Z | Q | True |
| 4 | 1 | 3 | male | 56.945683 | 4 | 10 | 1 | 758.999002 | W | S | True |
Query on dummy dataset
Average age and number of rows with the smartnoise-sql library on the remote dummy dataset
[6]:
# Average Age
QUERY = "SELECT COUNT(*) AS nb_passengers, \
AVG(Age) AS avg_age \
FROM df"
[7]:
# On the remote server dummy dataframe
dummy_res = client.smartnoise_sql.query(
query = QUERY,
epsilon = 100.0, # make sure to select high values of epsilon and delta to have small differences
delta = 2.0, # make sure to select high values of epsilon and delta to have small differences
dummy = True,
nb_rows = NB_ROWS,
seed = SEED
)
[8]:
print(f"Average age in remote dummy: {np.round(dummy_res.result.df['avg_age'][0], 2)} years old.")
print(f"Number of rows in remote dummy: {np.round(dummy_res.result.df['nb_passengers'][0], 2)}.")
Average age in remote dummy: 51.7 years old.
Number of rows in remote dummy: 199.
Get current budget
[9]:
client.get_initial_budget()
[9]:
InitialBudgetResponse(initial_epsilon=45.0, initial_delta=0.2)
[10]:
client.get_total_spent_budget()
[10]:
SpentBudgetResponse(total_spent_epsilon=0.0, total_spent_delta=0.0)
It is also useful to know the remaining budget. For this, we call the function get_remaining_budget, which simply subtracts the total spent budget from the initial budget.
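The subtraction itself can be sketched locally with illustrative numbers (values mirror the budget responses shown later in this notebook; no server call is involved):

```python
# Illustrative sketch: remaining budget = initial budget - total spent budget.
# The numbers mirror the responses displayed in this notebook, not a live query.
initial_epsilon, initial_delta = 45.0, 0.2
spent_epsilon, spent_delta = 1.5, 0.00014999500000001387

remaining_epsilon = initial_epsilon - spent_epsilon  # 43.5
remaining_delta = initial_delta - spent_delta        # ~0.199850005

print(remaining_epsilon, remaining_delta)
```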
[11]:
client.get_remaining_budget()
[11]:
RemainingBudgetResponse(remaining_epsilon=45.0, remaining_delta=0.2)
As expected, for now the remaining budget is equal to the initial budget.
Estimate cost of a query
Another safeguard is the functionality to estimate the cost of a query. As in OpenDP and SmartNoise-SQL, the budget actually used by a query may differ slightly from what the user requests. The cost function returns the estimated real cost of a query.
Of course, this estimation does not impact the user's budget.
[12]:
EPSILON = 0.5
DELTA = 1e-4
[13]:
client.smartnoise_sql.cost(
query = QUERY,
epsilon = EPSILON,
delta = DELTA
)
[13]:
CostResponse(epsilon=1.5, delta=0.00014999500000001387)
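The requested epsilon was 0.5 but the estimated cost is 1.5. One plausible reading, under the assumption that smartnoise-sql spends the requested epsilon once per noisy mechanism it instantiates (COUNT(*), plus the noisy sum and noisy count behind AVG(Age)), is:

```python
# Hedged back-of-the-envelope check (assumption: smartnoise-sql charges the
# requested epsilon once per noisy mechanism in the query plan).
EPSILON = 0.5
mechanisms = ["count_star", "sum_age", "count_age"]  # AVG = noisy sum / noisy count
estimated_total_epsilon = EPSILON * len(mechanisms)
print(estimated_total_epsilon)  # matches the 1.5 reported by the cost endpoint
```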
Query on real private dataset with smartnoise-sql.
[14]:
client.get_remaining_budget()
[14]:
RemainingBudgetResponse(remaining_epsilon=45.0, remaining_delta=0.2)
[15]:
response = client.smartnoise_sql.query(
query = QUERY,
epsilon = EPSILON,
delta = DELTA,
    dummy = False # Optional
)
[16]:
nb_passengers = response.result.df['nb_passengers'].iloc[0]
print(f"Number of passengers in real data: {nb_passengers}.")
avg_age = np.round(response.result.df['avg_age'].iloc[0], 2)
print(f"Average age in real data: {avg_age}.")
Number of passengers in real data: 887.
Average age in real data: 29.34.
After each query on the real dataset, the budget information is also returned to the researcher. It is possible to check the remaining budget again afterwards:
[17]:
client.get_remaining_budget()
[17]:
RemainingBudgetResponse(remaining_epsilon=43.5, remaining_delta=0.199850005)
As can be seen in get_total_spent_budget(), it is the budget estimated with estimate_smartnoise_sql_cost() that was spent.
[18]:
client.get_total_spent_budget()
[18]:
SpentBudgetResponse(total_spent_epsilon=1.5, total_spent_delta=0.00014999500000001387)
Step 4: Titanic statistics with opendp
[19]:
import opendp as dp
import opendp.transformations as trans
import opendp.measurements as meas
Confidence intervals for age over the whole population
[20]:
titanic_metadata
[20]:
(output identical to the titanic_metadata shown in Step 3)
[21]:
columns = ["PassengerId", "Pclass", "Name", "Sex", "Age", "SibSp", "Parch"]
[22]:
age_min = titanic_metadata['columns']['Age']['lower']
age_max = titanic_metadata['columns']['Age']['upper']
age_min, age_max
[22]:
(0.1, 100.0)
[23]:
age_transformation_pipeline = (
trans.make_split_dataframe(separator=",", col_names=columns) >>
trans.make_select_column(key="Age", TOA=str) >>
trans.then_cast_default(TOA=float) >>
trans.then_clamp(bounds=(age_min, age_max)) >>
trans.then_resize(size=nb_passengers.tolist(), constant=avg_age) >>
trans.then_variance()
)
[24]:
# Expect to fail !!!
client.opendp.query(
opendp_pipeline = age_transformation_pipeline,
dummy=True
)
---------------------------------------------------------------------------
InvalidQueryException Traceback (most recent call last)
Cell In[24], line 2
1 # Expect to fail !!!
----> 2 client.opendp.query(
3 opendp_pipeline = age_transformation_pipeline,
4 dummy=True
5 )
File /code/lomas_client/libraries/opendp.py:105, in OpenDPClient.query(self, opendp_pipeline, fixed_delta, dummy, nb_rows, seed)
102 body = request_model.model_validate(body_dict)
103 res = self.http_client.post(endpoint, body)
--> 105 return validate_model_response(res, QueryResponse)
File /code/lomas_client/utils.py:83, in validate_model_response(response, response_model)
80 r_model = response_model.model_validate_json(data)
81 return r_model
---> 83 raise_error(response)
84 return None
File /code/lomas_client/utils.py:26, in raise_error(response)
24 error_message = response.json()
25 if response.status_code == status.HTTP_400_BAD_REQUEST:
---> 26 raise InvalidQueryException(error_message["InvalidQueryException"])
27 if response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY:
28 raise ExternalLibraryException(
29 error_message["library"], error_message["ExternalLibraryException"]
30 )
InvalidQueryException: The pipeline provided is not a measurement. It cannot be processed in this server.
This is because the server only allows measurement pipelines, i.e. pipelines with differentially private results. We add Laplace noise to the pipeline and should then be able to instantiate it.
[26]:
var_age_transformation_pipeline = (
age_transformation_pipeline >>
meas.then_laplace(scale=5.0)
)
Now that the pipeline ends with a measurement, it can be applied on the server's dummy dataset.
[27]:
dummy_var_res = client.opendp.query(
opendp_pipeline = var_age_transformation_pipeline,
dummy=True
)
print(f"Dummy result for variance: {np.round(dummy_var_res.result.value, 2)}")
Dummy result for variance: 59.75
With opendp, the function estimate_opendp_cost is particularly useful to estimate the epsilon and delta that will be used, based on the scale value.
[28]:
cost_res = client.opendp.cost(
opendp_pipeline = var_age_transformation_pipeline
)
cost_res
[28]:
CostResponse(epsilon=2.2502841037292076, delta=0.0)
One can now execute the query on the real dataset.
[29]:
var_res = client.opendp.query(
opendp_pipeline = var_age_transformation_pipeline,
)
[30]:
print(f"Number of passengers: {nb_passengers} (from previous smartnoise-sql query).")
print(f"Average age: {np.round(avg_age, 2)} (from previous smartnoise-sql query).")
var_age = var_res.result.value
print(f"Variance of age: {np.round(var_age, 3)} (from opendp query).")
Number of passengers: 887 (from previous smartnoise-sql query).
Average age: 29.34 (from previous smartnoise-sql query).
Variance of age: 182.132 (from opendp query).
[31]:
# Get standard error
standard_error = np.sqrt(var_age/nb_passengers)
print(f"Standard error of age: {np.round(standard_error, 2)}.")
Standard error of age: 0.45.
[32]:
# Compute the 95% confidence interval
ZSCORE = 1.96
lower_bound = np.round(avg_age - ZSCORE*standard_error, 2)
upper_bound = np.round(avg_age + ZSCORE*standard_error, 2)
print(f"The 95% confidence interval of the age of all passengers is [{lower_bound}, {upper_bound}].")
The 95% confidence interval of the age of all passengers is [28.45, 30.23].
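The last three cells can be folded into a small local helper. This is only a convenience sketch around the DP estimates already obtained; the function name is ours and not part of lomas-client:

```python
import numpy as np

def dp_confidence_interval(mean, variance, n, zscore=1.96):
    """Combine DP estimates of mean, variance and row count into a normal CI."""
    standard_error = np.sqrt(variance / n)
    return (
        float(np.round(mean - zscore * standard_error, 2)),
        float(np.round(mean + zscore * standard_error, 2)),
    )

# With the values obtained above:
print(dp_confidence_interval(29.34, 182.132, 887))  # (28.45, 30.23)
```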