Feature to automate handling of replicate data #310
Hi Eva, thanks, I'm happy to work on this, probably this month. I would probably add a method that can be used like

```python
petab_problem_averaged = copy.deepcopy(petab_problem)
petab_problem_averaged.measurement_df = petab.measurements.average_from_replicates(
    petab_problem.measurement_df,
    average="mean",
    noise="standard_deviation",
)
```

with a warning that this will change the objective function.
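For illustration, a minimal sketch of what such a helper could look like, assuming the standard PEtab measurement-table columns (`observableId`, `simulationConditionId`, `time`, `measurement`, `noiseParameters`); `average_from_replicates` does not exist in PEtab yet, and the grouping columns and option names here are assumptions:

```python
import pandas as pd


def average_from_replicates(
    measurement_df: pd.DataFrame,
    average: str = "mean",
    noise: str = "standard_deviation",
) -> pd.DataFrame:
    """Collapse replicate measurements into one row per group (sketch).

    Groups rows that share observable, condition, and time point, replaces
    `measurement` with the chosen average, and writes the chosen spread
    statistic into `noiseParameters`.
    """
    group_cols = ["observableId", "simulationConditionId", "time"]
    grouped = measurement_df.groupby(group_cols, as_index=False)

    if average == "mean":
        averaged = grouped["measurement"].mean()
    elif average == "median":
        averaged = grouped["measurement"].median()
    else:
        raise ValueError(f"Unknown average: {average!r}")

    if noise == "standard_deviation":
        spread = grouped["measurement"].std()
    elif noise == "standard_error":
        spread = grouped["measurement"].sem()
    else:
        raise ValueError(f"Unknown noise: {noise!r}")

    # Time-point-specific noise values replace any previous noise model.
    averaged["noiseParameters"] = spread["measurement"]
    return averaged
```

A real implementation would likely also need to handle the remaining measurement-table columns (e.g. `observableParameters`, preequilibration conditions) and decide how to group when they differ between replicates.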
Hi all!
I am wondering about a specific step in setting up a parameter estimation problem and how it could be automated with PEtab: how to make use of replicate data in the parameter estimation problem.
If replicates are available, I can think of at least two distinct ways to handle the data:
1. Use the individual replicates to calculate their mean and standard error of the mean at each time point, then use those as the data points and the time-point-specific noise parameter values of the error model, respectively, when evaluating the objective function for parameter estimation.
2. Calculate the standard deviation of the individual replicates (for each time point) and use it as an error model parameter, while using the individual replicates themselves as the data points when evaluating the objective function.
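The second option above can be sketched in a few lines of pandas; the column names follow the PEtab measurement-table format, and the helper name `attach_replicate_noise` is hypothetical:

```python
import pandas as pd


def attach_replicate_noise(measurement_df: pd.DataFrame) -> pd.DataFrame:
    """Option 2 sketch: keep every replicate as its own data point,
    but set each row's noise parameter to the standard deviation of
    its replicate group (same observable, condition, and time)."""
    df = measurement_df.copy()
    group_cols = ["observableId", "simulationConditionId", "time"]
    df["noiseParameters"] = (
        df.groupby(group_cols)["measurement"].transform("std")
    )
    return df
```

Unlike averaging, this keeps the number of data points (and hence the weighting of the objective function) unchanged; only the noise model is derived from the replicates.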
It would be great if petab could provide a feature to do this and choose between the different options.
I discussed this a bit with Dilan already and he asked me to assign him to the issue, but I don’t know how to do this. 😊
Thanks, everyone!