Teaching

BIOINF 2119 Probabilistic Methods in Artificial Intelligence (Onsite course)
Taught every spring, 2010 - 2017

This course introduces fundamental concepts and methods in artificial intelligence that are applicable to problems in biomedicine. It is designed for students who do not necessarily have a background in computer science. The course provides the foundations of artificial intelligence methods, including search (breadth-first search, depth-first search, greedy search, etc.), probabilistic knowledge representation and reasoning (Bayesian networks: model, independencies, semantics, parameter estimation, and inference), decision theory, and machine learning (regression, neural networks, classification trees, support vector machines, Markov models, and hidden Markov models).
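
As a small taste of the probabilistic reasoning covered in the course, the sketch below applies Bayes' theorem to a tiny two-node network (Disease -> Test) in Python; all probabilities are made-up numbers chosen only for illustration.

  # Posterior inference in a two-node network Disease -> Test via Bayes' theorem.
  # All probabilities below are hypothetical values for illustration only.
  p_disease = 0.01                    # P(Disease = true), assumed prior
  p_pos_given_disease = 0.95          # P(Test = + | Disease = true)
  p_pos_given_healthy = 0.05          # P(Test = + | Disease = false)

  # P(Test = +) by marginalizing over Disease
  p_pos = (p_pos_given_disease * p_disease
           + p_pos_given_healthy * (1.0 - p_disease))

  # Bayes' theorem: P(Disease = true | Test = +)
  posterior = p_pos_given_disease * p_disease / p_pos
  print(f"P(Disease | Test = +) = {posterior:.3f}")   # about 0.161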

Resources and pointers

Writing

Four steps to writing papers:

  • (1 week) Outline: Create section headings with bullet points of topics to be covered in each section
  • (2 weeks) Fast writing: For each bullet point, write quickly, ignoring typos, citations, spelling, and grammar
  • (1 week) Break from writing
  • (2 weeks) Slow revising: Remove unnecessary information, rearrange for flow, review and adjust paragraphs, make several passes fixing one thing each time

Software

Informative Bayesian Model Selection (IBMS) is a computationally efficient method that can be applied to genome-wide data to detect both SNP-SNP interactions and interactions between two groups of SNPs (e.g., a group may consist of SNPs that map to a gene). The software for IBMS is available for download. The paper describing this method is Informative Bayesian Model Selection: A method for identifying interactions in genome-wide data and is published in Molecular BioSystems.
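
For orientation, the sketch below shows a generic form of Bayesian model selection over candidate SNP parent sets, using a standard Dirichlet-multinomial (BDeu-style) marginal likelihood on synthetic data. It illustrates the kind of model comparison involved, not the IBMS scoring scheme itself; the prior strength, data, and parent sets are assumptions made only for illustration.

  # A generic sketch of Bayesian model selection for a SNP-SNP interaction,
  # scoring the phenotype against candidate parent sets with a standard
  # Dirichlet-multinomial (BDeu-style) marginal likelihood. This is NOT the
  # IBMS scoring scheme; data, prior strength, and parent sets are illustrative.
  import numpy as np
  from math import lgamma
  from itertools import product

  def log_marginal_likelihood(y, parents, alpha=1.0):
      """BDeu-style log marginal likelihood of discrete target y given a list
      of discrete parent columns (each a 1-D integer array)."""
      y = np.asarray(y)
      target_values = np.unique(y)
      r = len(target_values)                      # number of target states
      parent_states = [np.unique(p) for p in parents]
      configs = list(product(*parent_states))     # all joint parent configurations
      q = len(configs)
      score = 0.0
      for cfg in configs:
          mask = np.ones(len(y), dtype=bool)
          for p, val in zip(parents, cfg):
              mask &= (np.asarray(p) == val)
          n_j = int(mask.sum())
          score += lgamma(alpha / q) - lgamma(alpha / q + n_j)
          for k in target_values:
              n_jk = int(np.sum(y[mask] == k))
              score += lgamma(alpha / (q * r) + n_jk) - lgamma(alpha / (q * r))
      return score

  # Synthetic data: the phenotype depends on the joint configuration of two SNPs.
  rng = np.random.default_rng(0)
  snp1 = rng.integers(0, 3, 2000)
  snp2 = rng.integers(0, 3, 2000)
  pheno = ((snp1 > 0) ^ (snp2 > 0)).astype(int)

  # The interaction model (both SNPs as parents) should score highest.
  for name, parents in [("no SNPs", []), ("snp1 only", [snp1]),
                        ("snp2 only", [snp2]), ("snp1 and snp2", [snp1, snp2])]:
      print(name, round(log_marginal_likelihood(pheno, parents), 1))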

The backward elimination algorithm uses a kernel-based conditional dependence measure to identify the Markov blanket in a fully multivariate fashion. The software for the backward elimination algorithm is available for download at the open source code repository GitHub. The paper describing this algorithm is Markov blanket ranking using kernel-based conditional dependence measures and is published in the proceedings of NIPS 2013 Workshop on Causality: Large-scale Experiment Design and Inference of Causal Mechanisms.
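
As a rough sketch of the backward-elimination loop (not of the kernel-based measure from the paper), the code below repeatedly removes the feature with the weakest conditional dependence on the target given the remaining features, using absolute partial correlation as a stand-in dependence measure on synthetic data.

  # Sketch of the backward-elimination loop: repeatedly drop the feature with
  # the weakest conditional dependence on the target given the remaining
  # features. The paper uses a kernel-based conditional dependence measure;
  # here absolute partial correlation (via regression residuals) is a stand-in,
  # so this illustrates the loop rather than the actual measure.
  import numpy as np

  def partial_dependence(y, x, Z):
      """|partial correlation| of y and x given the columns of Z (stand-in
      for a kernel-based conditional dependence measure)."""
      def residual(v, Z):
          if Z.shape[1] == 0:
              return v - v.mean()
          A = np.column_stack([Z, np.ones(len(v))])
          beta, *_ = np.linalg.lstsq(A, v, rcond=None)
          return v - A @ beta
      ry, rx = residual(y, Z), residual(x, Z)
      return abs(np.corrcoef(ry, rx)[0, 1])

  def backward_elimination(X, y, keep=2):
      """Remove the weakest feature until 'keep' features remain."""
      remaining = list(range(X.shape[1]))
      removal_order = []
      while len(remaining) > keep:
          scores = {j: partial_dependence(y, X[:, j],
                                          X[:, [k for k in remaining if k != j]])
                    for j in remaining}
          weakest = min(scores, key=scores.get)
          remaining.remove(weakest)
          removal_order.append(weakest)
      return remaining, removal_order

  # Synthetic example: only features 0 and 1 influence the target.
  rng = np.random.default_rng(1)
  X = rng.normal(size=(500, 6))
  y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=500)
  print(backward_elimination(X, y, keep=2))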

The deep multiple kernel learning algorithm tunes a deep multiple kernel net by alternating optimization with the span bound. It is an attempt to extend deep learning to small sample sizes. The software for the deep multiple kernel learning algorithm is available for download at the open source code repository GitHub. The paper describing this algorithm is Deep multiple kernel learning and is published in the proceedings of the IEEE 12th International Conference on Machine Learning and Applications (ICMLA 2013).
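
The sketch below illustrates only the idea of layering kernels: a first layer forms a weighted combination of base kernels, and a second layer applies an RBF kernel in the feature space induced by the first layer. The base kernels, weights, and bandwidths are arbitrary illustrative choices, and the span-bound alternating optimization described in the paper is not implemented here.

  # Illustration of layered ("deep") kernel composition: layer 1 is a weighted
  # combination of base kernels, layer 2 is an RBF kernel applied in the feature
  # space induced by layer 1. The base kernels, weights, and bandwidths are
  # arbitrary illustrative choices; the span-bound alternating optimization
  # from the paper is not implemented here.
  import numpy as np

  def rbf(X, Z, gamma):
      d2 = (X**2).sum(1)[:, None] + (Z**2).sum(1)[None, :] - 2 * X @ Z.T
      return np.exp(-gamma * d2)

  def linear(X, Z):
      return X @ Z.T

  def layer1(X, Z, weights=(0.5, 0.5)):
      """First layer: convex combination of base kernels."""
      return weights[0] * rbf(X, Z, gamma=0.1) + weights[1] * linear(X, Z)

  def layer2(X, Z, gamma=0.05):
      """Second layer: RBF in the feature space of layer 1, using
      ||phi(x) - phi(z)||^2 = k1(x, x) + k1(z, z) - 2 k1(x, z)."""
      kxx = np.diag(layer1(X, X))[:, None]
      kzz = np.diag(layer1(Z, Z))[None, :]
      return np.exp(-gamma * (kxx + kzz - 2 * layer1(X, Z)))

  # Composed kernel matrix on a small synthetic sample (diagonal entries are 1).
  rng = np.random.default_rng(2)
  X = rng.normal(size=(5, 4))
  print(np.round(layer2(X, X), 3))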

The Modular Relief Framework (MoRF) can be used to develop novel variations of the Relief algorithm for ranking single nucleotide variants (SNVs) in genome-wide data. The software for MoRF is available for download at the open source code repository GitHub. The paper describing this algorithm is Application of a spatially-weighted Relief algorithm for ranking genetic predictors of disease and is published in BioData Mining.
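
For orientation, the sketch below implements the core Relief weight update on discrete genotype data; it shows the baseline algorithm that MoRF modularizes, not the spatially-weighted variant described in the paper, and the data are synthetic.

  # Sketch of the core Relief weight update on discrete genotype data (the
  # baseline algorithm that MoRF modularizes, not the spatially-weighted
  # variant from the paper). The SNV matrix below is synthetic.
  import numpy as np

  def relief(X, y, n_samples=None, seed=0):
      """Basic Relief: reward features that differ on the nearest miss and
      penalize features that differ on the nearest hit."""
      n, p = X.shape
      n_samples = n if n_samples is None else n_samples
      w = np.zeros(p)
      for i in np.random.default_rng(seed).choice(n, n_samples, replace=False):
          dist = (X != X[i]).sum(axis=1).astype(float)   # Hamming distance
          dist[i] = np.inf                               # exclude the instance itself
          same, diff = (y == y[i]), (y != y[i])
          hit = np.argmin(np.where(same, dist, np.inf))  # nearest same-class neighbor
          miss = np.argmin(np.where(diff, dist, np.inf)) # nearest other-class neighbor
          w += (X[i] != X[miss]).astype(float) / n_samples
          w -= (X[i] != X[hit]).astype(float) / n_samples
      return w

  # Synthetic SNV matrix (genotypes 0/1/2); only the first SNV tracks the phenotype.
  rng = np.random.default_rng(3)
  X = rng.integers(0, 3, size=(300, 10))
  y = (X[:, 0] > 0).astype(int)
  print(np.round(relief(X, y), 3))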

Model Averaged Naive Bayes (MANB) is an algorithm that predicts patient outcomes from genome-wide data by efficiently model averaging over an exponential number of naive Bayes models. The software for MANB is available for download. The paper describing this algorithm is The application of naive Bayes model averaging to predict Alzheimer's disease from genome-wide data and is published in JAMIA.
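
As a conceptual illustration of what MANB averages over (not how it computes it), the sketch below enumerates all 2^p naive Bayes structures for a handful of binary features and weights each structure by its marginal likelihood; MANB's contribution is obtaining the same average efficiently without enumeration. The Beta(1,1) priors, Laplace smoothing, and synthetic data are assumptions made only for illustration.

  # Conceptual sketch: brute-force model averaging over all 2^p naive Bayes
  # structures (each feature's arc from the class present or absent). MANB
  # obtains the same average efficiently without enumeration; the Beta(1,1)
  # priors, Laplace smoothing, and data below are assumptions for illustration.
  import numpy as np
  from math import lgamma
  from itertools import product

  def log_ml_binary(x, groups):
      """Log marginal likelihood of binary feature x under a Beta(1,1) prior,
      with a separate Bernoulli parameter for each value of 'groups'."""
      score = 0.0
      for g in np.unique(groups):
          n = int((groups == g).sum())
          k = int(x[groups == g].sum())
          score += lgamma(2) - lgamma(2 + n) + lgamma(1 + k) + lgamma(1 + n - k)
      return score

  def nb_posterior(X, y, x_new, included):
      """P(y = 1 | x_new) for a naive Bayes model restricted to 'included'
      features, with add-one smoothing of the conditional probabilities."""
      log_odds = np.log(y.mean() / (1 - y.mean()))
      for j in included:
          for c, sign in ((1, +1), (0, -1)):
              theta = (X[y == c, j].sum() + 1.0) / ((y == c).sum() + 2.0)
              log_odds += sign * np.log(theta if x_new[j] else 1 - theta)
      return 1.0 / (1.0 + np.exp(-log_odds))

  def manb_brute_force(X, y, x_new):
      """Average the naive Bayes prediction over every feature-inclusion
      pattern, weighting each structure by its marginal likelihood
      (uniform structure prior, so the prior cancels)."""
      p = X.shape[1]
      log_w, preds = [], []
      for pattern in product([0, 1], repeat=p):
          included = [j for j, use in enumerate(pattern) if use]
          lw = sum(log_ml_binary(X[:, j], y if j in included else np.zeros_like(y))
                   for j in range(p))
          log_w.append(lw)
          preds.append(nb_posterior(X, y, x_new, included))
      w = np.exp(np.array(log_w) - max(log_w))   # shift to avoid underflow
      return float(np.dot(w, preds) / w.sum())

  # Synthetic example: four binary features, only the first two track the outcome.
  rng = np.random.default_rng(4)
  y = rng.integers(0, 2, 200)
  X = np.column_stack([(y ^ (rng.random(200) < 0.2)).astype(int),
                       (y ^ (rng.random(200) < 0.3)).astype(int),
                       rng.integers(0, 2, 200),
                       rng.integers(0, 2, 200)])
  print(manb_brute_force(X, y, np.array([1, 1, 0, 1])))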

Presentations

Artificial Intelligence in Medicine MSTP Workshop, 8 March 2017. The slides are here and the citations are below:

Courses in Statistics and Data Science

Undergraduate calculus sequence:
Textbook: James Stewart, Essential Calculus, Early Transcendentals, 2nd Edition

  • MATH 0220 Analytic Geometry and Calculus 1
  • MATH 0230 Analytic Geometry and Calculus 2
  • MATH 0240 Analytic Geometry and Calculus 3

Undergraduate linear algebra:

  • MATH 0280 Introduction to Matrices and Linear Algebra
  • MATH 1180 Linear Algebra 1

Introductory probability and statistics:
Textbook: I. Miller and M. Miller, John E. Freund’s Mathematical Statistics with Applications, 8th Edition

  • STAT 1151 Introduction to Probability
  • STAT 1152 Introduction to Mathematical Statistics

Intermediate probability and statistics:
Textbook: Morris H. DeGroot and Mark J. Schervish, Probability and Statistics, 4th Edition

  • STAT 1631 Intermediate Probability
  • STAT 1632 Intermediate Mathematical Statistics

Advanced probability:

  • STAT 2711: Probability Theory 1
  • STAT 2712: Probability Theory 2

Advanced statistics:

  • STAT 2131: Applied Statistical Methods 1
  • STAT 2132: Applied Statistical Methods 2

Data science:

  • STAT 1261: Principles of Data Science
  • STAT 1361: Statistical Learning and Data Science

Other topics:

  • STAT 1651 Bayesian Statistics
  • STAT 1661 Linear Regression
  • STAT 1662 Nonlinear Regression

Syllabi for Courses in Statistics and Data Science

MATH 0220 Analytic Geometry and Calculus 1
Textbook: James Stewart, Essential Calculus, Early Transcendentals, 2nd edition

  • FUNCTIONS AND LIMITS
    • Functions and Their Representations
    • A Catalog of Essential Functions
    • The Limit of a Function
    • Calculating Limits
    • Continuity
    • Limits Involving Infinity
  • DERIVATIVES
    • Derivatives and Rates of Change
    • The Derivative as a Function
    • Basic Differentiation Formulas
    • The Product and Quotient Rules
    • The Chain Rule
    • Implicit Differentiation
    • Related Rates
    • Linear Approximations and Differentials
  • INVERSE FUNCTIONS: Exponential, Logarithmic, and Inverse Trigonometric Functions
    • Exponential Functions
    • Inverse Functions and Logarithms
    • Derivatives of Logarithmic and Exponential Functions
    • Exponential Growth and Decay
    • Inverse Trigonometric Functions
    • Hyperbolic Functions
    • Indeterminate Forms and L’Hospital’s Rule
  • APPLICATIONS OF DIFFERENTIATION
    • Maximum and Minimum Values
    • The Mean Value Theorem
    • Derivatives and the Shapes of Graphs
    • Curve Sketching
    • Optimization Problems
    • Newton’s Method
    • Antiderivatives
  • INTEGRALS
    • Areas and Distances
    • The Definite Integral
    • Evaluating Definite Integrals
    • The Fundamental Theorem of Calculus
    • The Substitution Rule
  • TECHNIQUES OF INTEGRATION
    • Integration by Parts
    • Trigonometric Integrals and Substitutions

MATH 0230 Analytic Geometry and Calculus 2
Textbook: James Stewart, Essential Calculus, Early Transcendentals, 2nd edition

  • TECHNIQUES OF INTEGRATION
    • Integration by Parts
    • Trigonometric Integrals and Substitutions
    • Partial Fractions
    • Integration with Tables and Computer Algebra Systems
    • Approximate Integration
    • Improper Integrals
  • APPLICATIONS OF INTEGRATION
    • Areas Between Curves
    • Volumes
    • Volumes by Cylindrical Shells
    • Arc Length
    • Area of a Surface of Revolution
    • Applications to Physics and Engineering
    • Differential Equations
  • SERIES
    • Sequences
    • Series
    • The Integral and Comparison Tests
    • Other Convergence Tests
    • Power Series
    • Representing Functions as Power Series
    • Taylor and Maclaurin Series
    • Applications of Taylor Polynomials
  • PARAMETRIC EQUATIONS AND POLAR COORDINATES
    • Parametric Curves
    • Calculus with Parametric Curves
    • Polar Coordinates
    • Areas and Lengths in Polar Coordinates
    • Conic Sections in Polar Coordinates

MATH 0240 Analytic Geometry and Calculus 3
Textbook: James Stewart, Essential Calculus, Early Transcendentals, 2nd edition

  • VECTORS AND THE GEOMETRY OF SPACE
    • Three-Dimensional Coordinate Systems
    • Vectors
    • The Dot Product
    • The Cross Product
    • Equations of Lines and Planes
    • Cylinders and Quadric Surfaces
    • Vector Functions and Space Curves
    • Arc Length and Curvature
    • Motion in Space: Velocity and Acceleration
  • PARTIAL DERIVATIVES
    • Functions of Several Variables
    • Limits and Continuity
    • Partial Derivatives
    • Tangent Planes and Linear Approximations
    • The Chain Rule
    • Directional Derivatives and the Gradient Vector
    • Maximum and Minimum Values
    • Lagrange Multipliers
  • MULTIPLE INTEGRALS
    • Double Integrals over Rectangles
    • Double Integrals over General Regions
    • Double Integrals in Polar Coordinates
    • Applications of Double Integrals
    • Triple Integrals
    • Triple Integrals in Cylindrical Coordinates
    • Triple Integrals in Spherical Coordinates
    • Change of Variables in Multiple Integrals
  • VECTOR CALCULUS
    • Vector Fields
    • Line Integrals
    • The Fundamental Theorem for Line Integrals
    • Green’s Theorem
    • Curl and Divergence
    • Parametric Surfaces and Their Areas
    • Surface Integrals
    • Stokes’ Theorem
    • The Divergence Theorem

MATH 1180 Linear Algebra 1
Textbook: D. Poole, Linear Algebra: A Modern Introduction, 4th Edition

  • 1. VECTORS
    • Introduction: The Racetrack Game. The Geometry and Algebra of Vectors. Length and Angle: The Dot Product. Exploration: Vectors and Geometry. Lines and Planes. Exploration: The Cross Product. Writing Project: Origins of the Dot Product and the Cross Product. Applications.
  • 2. SYSTEMS OF LINEAR EQUATIONS
    • Introduction: Triviality. Introduction to Systems of Linear Equations. Direct Methods for Solving Linear Systems. Writing Project: A History of Gaussian Elimination. Explorations: Lies My Computer Told Me; Partial Pivoting; Counting Operations: An Introduction to the Analysis of Algorithms. Spanning Sets and Linear Independence. Applications. Vignette: The Global Positioning System. Iterative Methods for Solving Linear Systems.
  • 3. MATRICES
    • Introduction: Matrices in Action. Matrix Operations. Matrix Algebra. The Inverse of a Matrix. The LU Factorization. Subspaces, Basis, Dimension, and Rank. Introduction to Linear Transformations. Vignette: Robotics. Applications.
  • 4. EIGENVALUES AND EIGENVECTORS
    • Introduction: A Dynamical System on Graphs. Introduction to Eigenvalues and Eigenvectors. Determinants. Writing Project: Which Came First-the Matrix or the Determinant? Vignette: Lewis Carroll's Condensation Method. Exploration: Geometric Applications of Determinants. Eigenvalues and Eigenvectors of n x n Matrices. Writing Project: The History of Eigenvalues. Similarity and Diagonalization. Iterative Methods for Computing Eigenvalues. Applications and the Perron-Frobenius Theorem. Vignette: Ranking Sports Teams and Searching the Internet.
  • 5. ORTHOGONALITY
    • Introduction: Shadows on a Wall. Orthogonality in R^n. Orthogonal Complements and Orthogonal Projections. The Gram-Schmidt Process and the QR Factorization. Explorations: The Modified QR Factorization; Approximating Eigenvalues with the QR Algorithm. Orthogonal Diagonalization of Symmetric Matrices. Applications.
  • 6. VECTOR SPACES
    • Introduction: Fibonacci in (Vector) Space. Vector Spaces and Subspaces. Linear Independence, Basis, and Dimension. Writing Project: The Rise of Vector Spaces. Exploration: Magic Squares. Change of Basis. Linear Transformations. The Kernel and Range of a Linear Transformation. The Matrix of a Linear Transformation. Exploration: Tilings, Lattices and the Crystallographic Restriction. Applications.
  • 7. DISTANCE AND APPROXIMATION
    • Introduction: Taxicab Geometry. Inner Product Spaces. Explorations: Vectors and Matrices with Complex Entries; Geometric Inequalities and Optimization Problems. Norms and Distance Functions. Least Squares Approximation. The Singular Value Decomposition. Vignette: Digital Image Compression. Applications.

STAT 1151 Introduction to Probability
Textbook: I. Miller and M. Miller, John E. Freund’s Mathematical Statistics with Applications, 8th Edition

  • 1. Introduction
    • 1.1 Introduction
    • 1.2 Combinatorial Methods
    • 1.3 Binomial Coefficients
    • 1.4 The Theory in Practice
  • 2. Probability
    • 2.1 Introduction
    • 2.2 Sample Spaces
    • 2.3 Events
    • 2.4 The Probability of an Event
    • 2.5 Some Rules of Probability
    • 2.6 Conditional Probability
    • 2.7 Independent Events
    • 2.8 Bayes’ Theorem
    • 2.9 The Theory in Practice
  • 3. Probability Distributions and Probability Densities
    • 3.1 Random Variables
    • 3.2 Probability Distributions
    • 3.3 Continuous Random Variables
    • 3.4 Probability Density Functions
    • 3.5 Multivariate Distributions
    • 3.6 Marginal Distributions
    • 3.7 Conditional Distributions
    • 3.8 The Theory in Practice
  • 4. Mathematical Expectation
    • 4.1 Introduction
    • 4.2 The Expected Value of a Random Variable
    • 4.3 Moments
    • 4.4 Chebyshev’s Theorem
    • 4.5 Moment-Generating Functions
    • 4.6 Product Moments
    • 4.7 Moments of Linear Combinations of Random Variables
    • 4.8 Conditional Expectations
    • 4.9 The Theory in Practice
  • 5. Special Probability Distributions
    • 5.1 Introduction
    • 5.2 The Discrete Uniform Distribution
    • 5.3 The Bernoulli Distribution
    • 5.4 The Binomial Distribution
    • 5.5 The Negative Binomial and Geometric Distributions
    • 5.6 The Hypergeometric Distribution
    • 5.7 The Poisson Distribution
    • 5.8 The Multinomial Distribution
    • 5.9 The Multivariate Hypergeometric Distribution
    • 5.10 The Theory in Practice
  • 6. Special Probability Densities
    • 6.1 Introduction
    • 6.2 The Uniform Distribution
    • 6.3 The Gamma, Exponential, and Chi-Square Distributions
    • 6.4 The Beta Distribution
    • 6.5 The Normal Distribution
    • 6.6 The Normal Approximation to the Binomial Distribution
    • 6.7 The Bivariate Normal Distribution
    • 6.8 The Theory in Practice
  • 7. Functions of Random Variables
    • 7.1 Introduction
    • 7.2 Distribution Function Technique
    • 7.3 Transformation Technique: One Variable
    • 7.4 Transformation Technique: Several Variables
    • 7.5 Moment-Generating Function Technique
    • 7.6 The Theory in Application
  • 8. Sampling Distributions
    • 8.1 Introduction
    • 8.2 The Sampling Distribution of the Mean
    • 8.3 The Sampling Distribution of the Mean: Finite Populations
    • 8.4 The Chi-Square Distribution
    • 8.5 The t Distribution
    • 8.6 The F Distribution
    • 8.7 Order Statistics
    • 8.8 The Theory in Practice

STAT 1152 Introduction to Mathematical Statistics
Textbook: I. Miller and M. Miller, John E. Freund’s Mathematical Statistics with Applications, 8th Edition

  • 9. Decision Theory
    • 9.1 Introduction
    • 9.2 The Theory of Games
    • 9.3 Statistical Games
    • 9.4 Decision Criteria
    • 9.5 Minimax Criterion
    • 9.6 The Bayes Criterion
    • 9.7 The Theory in Practice
  • 10. Point Estimation
    • 10.1 Introduction
    • 10.2 Unbiased Estimators
    • 10.3 Efficiency
    • 10.4 Consistency
    • 10.5 Sufficiency
    • 10.6 Robustness
    • 10.7 The Method of Moments
    • 10.8 The Method of Maximum Likelihood
    • 10.9 Bayesian Estimation
    • 10.10 The Theory in Practice
  • 11. Interval Estimation
    • 11.1 Introduction
    • 11.2 The Estimation of Means
    • 11.3 The Estimation of Differences Between Means
    • 11.4 The Estimation of Proportions
    • 11.5 The Estimation of Differences Between Proportions
    • 11.6 The Estimation of Variances
    • 11.7 The Estimation of the Ratio of Two Variances
    • 11.8 The Theory in Practice
  • 12. Hypothesis Testing
    • 12.1 Introduction
    • 12.2 Testing a Statistical Hypothesis
    • 12.3 Losses and Risks
    • 12.4 The Neyman-Pearson Lemma
    • 12.5 The Power Function of a Test
    • 12.6 Likelihood Ratio Tests
    • 12.7 The Theory in Practice
  • 13. Tests of Hypotheses Involving Means, Variances, and Proportions
    • 13.1 Introduction
    • 13.2 Tests Concerning Means
    • 13.3 Tests Concerning Differences Between Means
    • 13.4 Tests Concerning Variances
    • 13.5 Tests Concerning Proportions
    • 13.6 Tests Concerning Differences Among k Proportions
    • 13.7 The Analysis of an r x c Table
    • 13.8 Goodness of Fit
    • 13.9 The Theory in Practice
  • 14. Regression and Correlation
    • 14.1 Introduction
    • 14.2 Linear Regression
    • 14.3 The Method of Least Squares
    • 14.4 Normal Regression Analysis
    • 14.5 Normal Correlation Analysis
    • 14.6 Multiple Linear Regression
    • 14.7 Multiple Linear Regression (Matrix Notation)
    • 14.8 The Theory in Practice
  • 15. Design and Analysis of Experiments
    • 15.1 Introduction
    • 15.2 One-Way Designs
    • 15.3 Randomized-Block Designs
    • 15.4 Factorial Experiments
    • 15.5 Multiple Comparisons
    • 15.6 Other Experimental Designs
    • 15.7 The Theory in Practice
  • 16. Nonparametric Tests
    • 16.1 Introduction
    • 16.2 The Sign Test
    • 16.3 The Signed-Rank Test
    • 16.4 Rank-Sum Tests: The U Test
    • 16.5 Rank-Sum Tests: The H Test
    • 16.6 Tests Based on Runs
    • 16.7 The Rank Correlation Coefficient
    • 16.8 The Theory in Practice