===== Teaching ===== **BIOINF 2119 Probabilistic Methods in Artificial Intelligence** (Onsite course)\\ Taught every spring, 2010–2017. This course introduces fundamental concepts and methods in artificial intelligence that are applicable to problems in biomedicine. It is designed for students who do not necessarily have a background in computer science. The course provides foundations in artificial intelligence methods, including search (breadth-first search, depth-first search, greedy search, etc.), probabilistic knowledge representation and reasoning (Bayesian networks: model, independencies, semantics, parameter estimation, and inference), decision theory, and machine learning (regression, neural networks, classification trees, support vector machines, Markov models, and hidden Markov models). **Resources and pointers** * [[http://www.stat.wisc.edu/~ifischer/calculus.pdf|Basic calculus refresher]] * [[http://web.eecs.umich.edu/~ocj/courses/autorob/autorob_05_linear_refresh.pdf|Basic linear algebra refresher]] * [[http://parrt.cs.usfca.edu/doc/matrix-calculus/index.html|Matrix calculus for deep learning]] * [[https://www.desmos.com/calculator/kxz6lzszf9|Logistic regression loss functions]] * [[http://www.ccs.neu.edu/home/vip/teach/MLcourse/2_GD_REG_pton_NN/lecture_notes/logistic_regression_loss_function/logistic_regression_loss.pdf|Logistic regression with different loss functions]] ===== Writing ===== Four steps to writing papers in 6 weeks: * (1 week) Outline: Create section headings with bullet points of the topics to be covered in each section * (2 weeks) Fast writing: For each bullet point, write quickly, ignoring typos, spelling, grammar, and citations * (1 week) Break from writing * (2 weeks) Slow revising: Remove unnecessary information, rearrange for flow, review and adjust paragraphs, and make several passes, fixing one thing each time **Resources and pointers** * [[https://www.insidehighered.com/blogs/gradhacker/how-develop-strategic-writing-plan|How to
Develop a Strategic Writing Plan]] * [[https://islandeditions.files.wordpress.com/2014/10/fstwritingslw-rev_handouts.pdf|Practice Fast Writing & Slow Editing]] * [[https://unstick.me/write-fast-edit-slow/|Write Fast, Edit Slow]] ===== Software ===== **Informative Bayesian Model Selection (IBMS)** is a computationally efficient method that can be applied to genome-wide data to detect both SNP-SNP interactions and interactions between two groups of SNPs (e.g., a group may consist of SNPs that map to a gene). The software for IBMS is available for [[http://lbb.ut.ac.ir/Download/LBBsoft/IBMS/|download]]. The paper describing this method is [[http://dx.doi.org/10.1039/C4MB00123K|Informative Bayesian Model Selection: A method for identifying interactions in genome-wide data]], published in Molecular BioSystems. The **backward elimination algorithm** uses a kernel-based conditional dependence measure to identify the Markov blanket in a fully multivariate fashion. The software for the backward elimination algorithm is available for download from the open-source code repository [[https://github.com/ericstrobl|GitHub]]. The paper describing this algorithm is [[http://arxiv-web3.library.cornell.edu/abs/1402.0108|Markov blanket ranking using kernel-based conditional dependence measures]], published in the proceedings of the NIPS 2013 Workshop on Causality: Large-scale Experiment Design and Inference of Causal Mechanisms. The **deep multiple kernel learning algorithm** tunes a deep multiple kernel net by alternating optimization with the span bound. It is an attempt to extend deep learning to small sample sizes. The software for the deep multiple kernel learning algorithm is available for download from the open-source code repository [[https://github.com/ericstrobl/deepMKL|GitHub]].
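As a rough illustration of the deep multiple kernel learning idea (this sketch is not the deepMKL implementation linked above: it stacks a convex combination of base kernels under an elementwise exponential, and, as a simplification, it selects the kernel weights by validation accuracy rather than by alternating optimization with the span bound):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data in the small-sample regime that deep MKL targets.
n = 40
X = np.vstack([rng.normal(-1.0, 0.7, size=(n, 2)),
               rng.normal(+1.0, 0.7, size=(n, 2))])
y = np.hstack([-np.ones(n), np.ones(n)])
perm = rng.permutation(2 * n)
X, y = X[perm], y[perm]
Xtr, ytr, Xva, yva = X[:60], y[:60], X[60:], y[60:]

def rbf(A, B, gamma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def base_kernels(A, B):
    # First-layer base kernels: two RBF widths plus a linear kernel.
    return [rbf(A, B, 0.5), rbf(A, B, 2.0), A @ B.T]

def deep_kernel(A, B, mu):
    # Layer 1: convex combination of the base kernels.
    # Layer 2: elementwise exp, which preserves positive semidefiniteness.
    K1 = sum(m * K for m, K in zip(mu, base_kernels(A, B)))
    return np.exp(K1)

def val_accuracy(mu, lam=1e-2):
    # Kernel ridge regression on +/-1 labels as a simple stand-in classifier.
    Ktr = deep_kernel(Xtr, Xtr, mu)
    alpha = np.linalg.solve(Ktr + lam * np.eye(len(ytr)), ytr)
    yhat = np.sign(deep_kernel(Xva, Xtr, mu) @ alpha)
    return (yhat == yva).mean()

# Grid search over kernel weights on the simplex -- a crude stand-in for the
# paper's span-bound-guided alternating optimization.
candidates = [np.array(m, dtype=float) for m in
              [(1, 0, 0), (0, 1, 0), (0, 0, 1),
               (.5, .5, 0), (.5, 0, .5), (0, .5, .5), (1/3, 1/3, 1/3)]]
best_mu = max(candidates, key=val_accuracy)
acc = val_accuracy(best_mu)
print(f"best weights {best_mu}, validation accuracy {acc:.2f}")
```

The elementwise exponential of a positive semidefinite kernel matrix is again positive semidefinite, which is what makes the second layer a valid kernel; the real algorithm tunes the weights far more efficiently than the grid search used here.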
The paper describing this algorithm is [[http://arxiv.org/abs/1310.3101|Deep multiple kernel learning]], published in the proceedings of the IEEE 12th International Conference on Machine Learning and Applications (ICMLA 2013). The **Modular Relief Framework (MoRF)** can be used to develop novel variations of the Relief algorithm for ranking single nucleotide variants (SNVs) in genome-wide data. The software for MoRF is available for download from the open-source code repository [[https://github.com/mattstokes42/MoRF|GitHub]]. The paper describing this algorithm is [[http://www.biodatamining.org/content/5/1/20|Application of a spatially-weighted Relief algorithm for ranking genetic predictors of disease]], published in BioData Mining. **Model Averaged Naive Bayes (MANB)** is an algorithm that predicts patient outcomes from genome-wide data by efficiently model averaging over an exponential number of naive Bayes models. The software for MANB is available for [[http://www.dbmi.pitt.edu/cooper-lab-software|download]]. The paper describing this algorithm is [[http://jamia.oxfordjournals.org/content/18/4/370.long|The application of naive Bayes model averaging to predict Alzheimer's disease from genome-wide data]], published in JAMIA. ===== Presentations ===== **Artificial Intelligence in Medicine** - {{wiki:ai_in_medicine_2023.pdf|slides are here}} **Artificial Intelligence in Medicine** MSTP Workshop, 8 March 2017. The {{wiki:ai_in_medicine_2017.pdf|slides are here}} and the citations are below: * Friedman CP. [[https://www.ncbi.nlm.nih.gov/pubmed/19074294|A "fundamental theorem" of biomedical informatics]]. J Am Med Inform Assoc. 2009 Mar-Apr;16(2):169-70. * Berner ES, Webster GD, et al. [[https://www.ncbi.nlm.nih.gov/pubmed/8190157|Performance of four computer-based diagnostic systems]]. N Engl J Med. 1994 Jun 23;330(25):1792-6. * Henderson EJ, Rubin GP.
[[https://www.ncbi.nlm.nih.gov/pubmed/23772310|The utility of an online diagnostic decision support system (Isabel) in general practice: a process evaluation]]. JRSM Short Rep. 2013 Apr 4;4(5):31. * Devarakonda M. Watson cognitive computing for Electronic Medical Records. AMIA Joint Summits on Translational Science, Podium Presentation, 2016. * Raghavan P, Patwardhan S. Question answering on Electronic Medical Records. AMIA Joint Summits on Translational Science, Podium Presentation, 2016. * King AJ, Cooper GF, Hochheiser H, Clermont G, Visweswaran S. [[https://www.ncbi.nlm.nih.gov/pubmed/26958296|Development and preliminary evaluation of a prototype of a Learning Electronic Medical Record System]]. AMIA Annu Symp Proc. 2015 Nov 5;2015:1967-75. * Hauskrecht M, Batal I, Valko M, Visweswaran S, Cooper GF, Clermont G. [[https://www.ncbi.nlm.nih.gov/pubmed/22944172|Outlier detection for patient monitoring and alerting]]. J Biomed Inform. 2013 Feb;46(1):47-55. * Chen L, Dubrawski A, Clermont G, Hravnak M, Pinsky MR. [[https://www.ncbi.nlm.nih.gov/pubmed/26958283|Modelling risk of cardio-respiratory instability as a heterogeneous process]]. AMIA Annu Symp Proc. 2015 Nov 5;2015:1841-50. * Sebastiani P, Ramoni MF, et al. [[https://www.ncbi.nlm.nih.gov/pubmed/15778708|Genetic dissection and prognostic modeling of overt stroke in sickle cell anemia]]. Nat Genet. 2005;37(4):435-40. * Wagner MM, Robinson JM, Tsui FC, Espino JU, Hogan WR. [[https://www.ncbi.nlm.nih.gov/pubmed/12807802|Design of a national retail data monitor for public health surveillance]]. J Am Med Inform Assoc. 2003 Sep-Oct;10(5):409-18. * Klembczyk JJ, Jalalpour M, et al. [[https://www.ncbi.nlm.nih.gov/pubmed/27354313|Google Flu Trends spatial variability validated against Emergency Department influenza-related visits]]. J Med Internet Res. 2016 Jun;18(6):e175. * Shah NH, LePendu P, et al.
[[https://www.ncbi.nlm.nih.gov/pubmed/26061035|Proton pump inhibitor usage and the risk of myocardial infarction in the general population]]. PLoS One. 2015 Jun 10;10(6):e0124653. * Wong HR, et al. [[https://www.ncbi.nlm.nih.gov/pubmed/23025259|The pediatric sepsis biomarker risk model]]. Crit Care. 2012 Oct 1;16(5):R174. * Sachs K, Perez O, Pe'er D, Lauffenburger DA, Nolan GP. [[https://www.ncbi.nlm.nih.gov/pubmed/15845847|Causal protein-signaling networks derived from multiparameter single-cell data]]. Science. 2005 Apr 22;308(5721):523-9. * Strobl EV, Zhang K, Visweswaran S. [[https://arxiv.org/abs/1702.03877|Approximate kernel-based conditional independence tests for fast non-parametric causal discovery]]. arXiv:1702.03877, 2017. * Hripcsak G, Ryan PB, et al. [[https://www.ncbi.nlm.nih.gov/pubmed/27274072|Characterizing treatment pathways at scale using the OHDSI network]]. Proc Natl Acad Sci U S A. 2016 Jul 5;113(27):7329-36. * Boland MR, Shahn Z, Madigan D, Hripcsak G, Tatonetti NP. [[https://www.ncbi.nlm.nih.gov/pubmed/26041386|Birth month affects lifetime disease risk: a phenome-wide method]]. J Am Med Inform Assoc. 2015 Sep;22(5):1042-53. * Friedman CP, Wong AK, Blumenthal D. [[https://www.ncbi.nlm.nih.gov/pubmed/21068440|Achieving a nationwide learning health system]]. Sci Transl Med. 2010 Nov 10;2(57):57cm29. ===== Courses in Statistics and Data Science ===== Undergraduate calculus sequence:\\ Textbook: James Stewart, Essential Calculus, Early Transcendentals, 2nd Edition * MATH 0220 Analytic Geometry and Calculus 1 * MATH 0230 Analytic Geometry and Calculus 2 * MATH 0240 Analytic Geometry and Calculus 3 Undergraduate linear algebra: * MATH 0280 Introduction to Matrices and Linear Algebra * MATH 1180 Linear Algebra 1 Introductory probability and statistics:\\ Textbook: I. Miller and M. Miller, John E.
Freund’s Mathematical Statistics with Applications, 8th Edition * STAT 1151 Introduction to Probability * STAT 1152 Introduction to Mathematical Statistics Intermediate probability and statistics:\\ Textbook: Morris H. DeGroot and Mark J. Schervish, Probability and Statistics, 4th Edition * STAT 1631 Intermediate Probability * STAT 1632 Intermediate Mathematical Statistics Advanced probability: * STAT 2711 Probability Theory 1 * STAT 2712 Probability Theory 2 Advanced statistics: * STAT 2131 Applied Statistical Methods 1 * STAT 2132 Applied Statistical Methods 2 Data science: * STAT 1261 Principles of Data Science * STAT 1361 Statistical Learning and Data Science Other topics: * STAT 1651 Bayesian Statistics * STAT 1661 Linear Regression * STAT 1662 Nonlinear Regression ===== Syllabi for Courses in Statistics and Data Science ===== **MATH 0220 Analytic Geometry and Calculus 1**\\ Textbook: [[https://pittcat.pitt.edu/cgi-bin/Pwebrecon.cgi?v1=1&hd=1,1&CallBrowse=1&SEQ=20171028123545&PID=b8ThldIENRLjESiyqs0WtjazXz&SID=2|James Stewart, Essential Calculus, Early Transcendentals, 2nd edition]] * FUNCTIONS AND LIMITS * Functions and Their Representations * A Catalog of Essential Functions * The Limit of a Function * Calculating Limits * Continuity * Limits Involving Infinity * DERIVATIVES * Derivatives and Rates of Change * The Derivative as a Function * Basic Differentiation Formulas * The Product and Quotient Rules * The Chain Rule * Implicit Differentiation * Related Rates * Linear Approximations and Differentials * INVERSE FUNCTIONS: Exponential, Logarithmic, and Inverse Trigonometric Functions * Exponential Functions * Inverse Functions and Logarithms * Derivatives of Logarithmic and Exponential Functions * Exponential Growth and Decay * Inverse Trigonometric Functions * Hyperbolic Functions * Indeterminate Forms and L’Hospital’s Rule * APPLICATIONS OF DIFFERENTIATION * Maximum and Minimum Values * The Mean Value Theorem * Derivatives and the Shapes of Graphs *
Curve Sketching * Optimization Problems * Newton’s Method * Antiderivatives * INTEGRALS * Areas and Distances * The Definite Integral * Evaluating Definite Integrals * The Fundamental Theorem of Calculus * The Substitution Rule * TECHNIQUES OF INTEGRATION * Integration by Parts * Trigonometric Integrals and Substitutions **MATH 0230 Analytic Geometry and Calculus 2**\\ Textbook: [[https://pittcat.pitt.edu/cgi-bin/Pwebrecon.cgi?v1=1&hd=1,1&CallBrowse=1&SEQ=20171028123545&PID=b8ThldIENRLjESiyqs0WtjazXz&SID=2|James Stewart, Essential Calculus, Early Transcendentals, 2nd edition]] * TECHNIQUES OF INTEGRATION * Integration by Parts * Trigonometric Integrals and Substitutions * Partial Fractions * Integration with Tables and Computer Algebra Systems * Approximate Integration * Improper Integrals * APPLICATIONS OF INTEGRATION * Areas Between Curves * Volumes * Volumes by Cylindrical Shells * Arc Length * Area of a Surface of Revolution * Applications to Physics and Engineering * Differential Equations * SERIES * Sequences * Series * The Integral and Comparison Tests * Other Convergence Tests * Power Series * Representing Functions as Power Series * Taylor and Maclaurin Series * Applications of Taylor Polynomials * PARAMETRIC EQUATIONS AND POLAR COORDINATES * Parametric Curves * Calculus with Parametric Curves * Polar Coordinates * Areas and Lengths in Polar Coordinates * Conic Sections in Polar Coordinates **MATH 0240 Analytic Geometry and Calculus 3**\\ Textbook: [[https://pittcat.pitt.edu/cgi-bin/Pwebrecon.cgi?v1=1&hd=1,1&CallBrowse=1&SEQ=20171028123545&PID=b8ThldIENRLjESiyqs0WtjazXz&SID=2|James Stewart, Essential Calculus, Early Transcendentals, 2nd edition]] * VECTORS AND THE GEOMETRY OF SPACE * Three-Dimensional Coordinate Systems * Vectors * The Dot Product * The Cross Product * Equations of Lines and Planes * Cylinders and Quadric Surfaces * Vector Functions and Space Curves * Arc Length and Curvature * Motion in Space: Velocity and Acceleration * PARTIAL 
DERIVATIVES * Functions of Several Variables * Limits and Continuity * Partial Derivatives * Tangent Planes and Linear Approximations * The Chain Rule * Directional Derivatives and the Gradient Vector * Maximum and Minimum Values * Lagrange Multipliers * MULTIPLE INTEGRALS * Double Integrals over Rectangles * Double Integrals over General Regions * Double Integrals in Polar Coordinates * Applications of Double Integrals * Triple Integrals * Triple Integrals in Cylindrical Coordinates * Triple Integrals in Spherical Coordinates * Change of Variables in Multiple Integrals * VECTOR CALCULUS * Vector Fields * Line Integrals * The Fundamental Theorem for Line Integrals * Green’s Theorem * Curl and Divergence * Parametric Surfaces and Their Areas * Surface Integrals * Stokes’ Theorem * The Divergence Theorem **MATH 1180 Linear Algebra 1**\\ Textbook: D. Poole, Linear Algebra: a Modern Introduction, 4th edition * 1. VECTORS * Introduction: The Racetrack Game. The Geometry and Algebra of Vectors. Length and Angle: The Dot Product. Exploration: Vectors and Geometry. Lines and Planes. Exploration: The Cross Product. Writing Project: Origins of the Dot Product and the Cross Product. Applications. * 2. SYSTEMS OF LINEAR EQUATIONS * Introduction: Triviality. Introduction to Systems of Linear Equations. Direct Methods for Solving Linear Systems. Writing Project: A History of Gaussian Elimination. Explorations: Lies My Computer Told Me; Partial Pivoting; Counting Operations: An Introduction to the Analysis of Algorithms. Spanning Sets and Linear Independence. Applications. Vignette: The Global Positioning System. Iterative Methods for Solving Linear Systems. * 3. MATRICES * Introduction: Matrices in Action. Matrix Operations. Matrix Algebra. The Inverse of a Matrix. The LU Factorization. Subspaces, Basis, Dimension, and Rank. Introduction to Linear Transformations. Vignette: Robotics. Applications. * 4. EIGENVALUES AND EIGENVECTORS * Introduction: A Dynamical System on Graphs. 
Introduction to Eigenvalues and Eigenvectors. Determinants. Writing Project: Which Came First-the Matrix or the Determinant? Vignette: Lewis Carroll's Condensation Method. Exploration: Geometric Applications of Determinants. Eigenvalues and Eigenvectors of n x n Matrices. Writing Project: The History of Eigenvalues. Similarity and Diagonalization. Iterative Methods for Computing Eigenvalues. Applications and the Perron-Frobenius Theorem. Vignette: Ranking Sports Teams and Searching the Internet. * 5. ORTHOGONALITY * Introduction: Shadows on a Wall. Orthogonality in Rn. Orthogonal Complements and Orthogonal Projections. The Gram-Schmidt Process and the QR Factorization. Explorations: The Modified QR Factorization; Approximating Eigenvalues with the QR Algorithm. Orthogonal Diagonalization of Symmetric Matrices. Applications. * 6. VECTOR SPACES * Introduction: Fibonacci in (Vector) Space. Vector Spaces and Subspaces. Linear Independence, Basis, and Dimension. Writing Project: The Rise of Vector Spaces. Exploration: Magic Squares. Change of Basis. Linear Transformations. The Kernel and Range of a Linear Transformation. The Matrix of a Linear Transformation. Exploration: Tilings, Lattices and the Crystallographic Restriction. Applications. * 7. DISTANCE AND APPROXIMATION * Introduction: Taxicab Geometry. Inner Product Spaces. Explorations: Vectors and Matrices with Complex Entries; Geometric Inequalities and Optimization Problems. Norms and Distance Functions. Least Squares Approximation. The Singular Value Decomposition. Vignette: Digital Image Compression. Applications. **STAT 1151 Introduction to Probability**\\ Textbook: [[http://pittcat.hsls.pitt.edu/cgi-bin/Pwebrecon.cgi?v1=1&hd=1,1&CallBrowse=1&SEQ=20171028123255&PID=7NgPDWqylo06u8fpR2lZNB3GX67C&SID=2|I. Miller and M. Miller, John E. Freund’s Mathematical Statistics with Applications, 8th Edition]] * 1. 
Introduction * 1.1 Introduction * 1.2 Combinatorial Methods * 1.3 Binomial Coefficients * 1.4 The Theory in Practice * 2. Probability * 2.1 Introduction * 2.2 Sample Spaces * 2.3 Events * 2.4 The Probability of an Event * 2.5 Some Rules of Probability * 2.6 Conditional Probability * 2.7 Independent Events * 2.8 Bayes’ Theorem * 2.9 The Theory in Practice * 3. Probability Distributions and Probability Densities * 3.1 Random Variables * 3.2 Probability Distributions * 3.3 Continuous Random Variables * 3.4 Probability Density Functions * 3.5 Multivariate Distributions * 3.6 Marginal Distributions * 3.7 Conditional Distributions * 3.8 The Theory in Practice * 4. Mathematical Expectation * 4.1 Introduction * 4.2 The Expected Value of a Random Variable * 4.3 Moments * 4.4 Chebyshev’s Theorem * 4.5 Moment-Generating Functions * 4.6 Product Moments * 4.7 Moments of Linear Combinations of Random Variables * 4.8 Conditional Expectations * 4.9 The Theory in Practice * 5. Special Probability Distributions * 5.1 Introduction * 5.2 The Discrete Uniform Distribution * 5.3 The Bernoulli Distribution * 5.4 The Binomial Distribution * 5.5 The Negative Binomial and Geometric Distributions * 5.6 The Hypergeometric Distribution * 5.7 The Poisson Distribution * 5.8 The Multinomial Distribution * 5.9 The Multivariate Hypergeometric Distribution * 5.10 The Theory in Practice * 6. Special Probability Densities * 6.1 Introduction * 6.2 The Uniform Distribution * 6.3 The Gamma, Exponential, and Chi-Square Distributions * 6.4 The Beta Distribution * 6.5 The Normal Distribution * 6.6 The Normal Approximation to the Binomial Distribution * 6.7 The Bivariate Normal Distribution * 6.8 The Theory in Practice * 7. Functions of Random Variables * 7.1 Introduction * 7.2 Distribution Function Technique * 7.3 Transformation Technique: One Variable * 7.4 Transformation Technique: Several Variables * 7.5 Moment-Generating Function Technique * 7.6 The Theory in Application * 8.
Sampling Distributions * 8.1 Introduction * 8.2 The Sampling Distribution of the Mean * 8.3 The Sampling Distribution of the Mean: Finite Populations * 8.4 The Chi-Square Distribution * 8.5 The t Distribution * 8.6 The F Distribution * 8.7 Order Statistics * 8.8 The Theory in Practice **STAT 1152 Introduction to Mathematical Statistics**\\ Textbook: [[http://pittcat.hsls.pitt.edu/cgi-bin/Pwebrecon.cgi?v1=1&hd=1,1&CallBrowse=1&SEQ=20171028123255&PID=7NgPDWqylo06u8fpR2lZNB3GX67C&SID=2|I. Miller and M. Miller, John E. Freund’s Mathematical Statistics with Applications, 8th Edition]] * 9. Decision Theory * 9.1 Introduction * 9.2 The Theory of Games * 9.3 Statistical Games * 9.4 Decision Criteria * 9.5 Minimax Criterion * 9.6 The Bayes Criterion * 9.7 The Theory in Practice * 10. Point Estimation * 10.1 Introduction * 10.2 Unbiased Estimators * 10.3 Efficiency * 10.4 Consistency * 10.5 Sufficiency * 10.6 Robustness * 10.7 The Method of Moments * 10.8 The Method of Maximum Likelihood * 10.9 Bayesian Estimation * 10.10 The Theory in Practice * 11. Interval Estimation * 11.1 Introduction * 11.2 The Estimation of Means * 11.3 The Estimation of Differences Between Means * 11.4 The Estimation of Proportions * 11.5 The Estimation of Differences Between Proportions * 11.6 The Estimation of Variances * 11.7 The Estimation of the Ratio of Two Variances * 11.8 The Theory in Practice * 12. Hypothesis Testing * 12.1 Introduction * 12.2 Testing a Statistical Hypothesis * 12.3 Losses and Risks * 12.4 The Neyman-Pearson Lemma * 12.5 The Power Function of a Test * 12.6 Likelihood Ratio Tests * 12.7 The Theory in Practice * 13. 
Tests of Hypotheses Involving Means, Variances, and Proportions * 13.1 Introduction * 13.2 Tests Concerning Means * 13.3 Tests Concerning Differences Between Means * 13.4 Tests Concerning Variances * 13.5 Tests Concerning Proportions * 13.6 Tests Concerning Differences Among k Proportions * 13.7 The Analysis of an r x c Table * 13.8 Goodness of Fit * 13.9 The Theory in Practice * 14. Regression and Correlation * 14.1 Introduction * 14.2 Linear Regression * 14.3 The Method of Least Squares * 14.4 Normal Regression Analysis * 14.5 Normal Correlation Analysis * 14.6 Multiple Linear Regression * 14.7 Multiple Linear Regression (Matrix Notation) * 14.8 The Theory in Practice * 15. Design and Analysis of Experiments * 15.1 Introduction * 15.2 One-Way Designs * 15.3 Randomized-Block Designs * 15.4 Factorial Experiments * 15.5 Multiple Comparisons * 15.6 Other Experimental Designs * 15.7 The Theory in Practice * 16. Nonparametric Tests * 16.1 Introduction * 16.2 The Sign Test * 16.3 The Signed-Rank Test * 16.4 Rank-Sum Tests: The U Test * 16.5 Rank-Sum Tests: The H Test * 16.6 Tests Based on Runs * 16.7 The Rank Correlation Coefficient * 16.8 The Theory in Practice