Senior Projects 2016
Xavier Boesken - Geometry of Cubic Polynomials, Advisor Dr. Danny Otero
Imagine a sphere floating in 3-space. By inscribing one of its great circles within an equilateral triangle, we can use linear projection to map to the (x,y)-plane, viewed here as the complex plane z = x + iy, and project the vertices of the equilateral triangle onto the roots of a given cubic polynomial p(z). This construction allows us to prove Marden's Theorem: the roots of the derivative p'(z) are the foci of the ellipse inscribed in the triangle in the complex plane determined by the roots of the polynomial and tangent to the midpoints of its sides. It also sheds light on Cardano's formula for finding the roots of the cubic p(z).
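For reference, the two results named above can be written out explicitly; the depressed-cubic normalization below is a standard convention used here for illustration, not a formulation taken from the project itself.

    \[
      p(z) = (z - z_1)(z - z_2)(z - z_3), \qquad
      \text{the roots of } p'(z) \text{ are the foci of the Steiner inellipse of } \triangle z_1 z_2 z_3 .
    \]
    \[
      \text{Cardano, for the depressed cubic } t^3 + a t + b = 0:\quad
      t = \sqrt[3]{-\tfrac{b}{2} + \sqrt{\tfrac{b^2}{4} + \tfrac{a^3}{27}}}
        + \sqrt[3]{-\tfrac{b}{2} - \sqrt{\tfrac{b^2}{4} + \tfrac{a^3}{27}}}.
    \]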
Sarah Davidson - Benchmark Dose Modeling with Covariates for Nanomaterials, Advisors Dr. Max Buot, Dr. Eileen Kuempel, Mr. Randall Smith, Mr. Nathan Drew
In the last decade, the use of engineered nanomaterials (ENMs) such as titanium dioxide (TiO2), carbon nanotubes (CNTs), and carbon nanofibers (CNFs), as well as a variety of other materials, has become increasingly popular in commerce because of their many beneficial properties (e.g., the ability to manufacture products that are lighter, stronger, and/or more compact). However, according to the National Institute for Occupational Safety and Health, as new nanotechnologies are developed it is prudent to keep the health and safety of workers who produce or use these materials at the forefront. For many ENMs, occupational exposure limits (OELs) are not available, and the OELs developed for microscale materials may not be adequate for ENMs. In the absence of human data, rodent assays are often used to find a dose estimate, which can then be used as a point of departure (POD) to extrapolate to humans. Some bioassays report summary statistics, which can be used to determine benchmark dose (BMD) estimates - the dose that corresponds to a specified level of increased response, called a benchmark response or BMR [4]. Pooling data across studies with a small number of dose groups (as in many of the studies in this dataset) provides a more robust dataset by increasing the sample size, although it also adds variability across different experimental designs (i.e., species, strain, gender). Thus, the aim of this project was twofold: to examine the influence of material type on the dose-response relationship using statistical regression modeling in R (statistical software), since the EPA's Benchmark Dose Software (BMDS) does not allow for covariates, and to build upon these regression models by adding covariates to account for experimental design features whose variability may obscure these relationships.
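The project itself was carried out in R; as a rough illustration only, the sketch below shows the general shape of a dose-response regression with a material-type covariate, written in Python with statsmodels. The variable names, fabricated numbers, and the linear model form are assumptions for illustration, not the models or data used in the project.

    # Hypothetical sketch: dose-response regression with a material-type covariate.
    # Column names, units, and values are illustrative only.
    import pandas as pd
    import statsmodels.formula.api as smf

    data = pd.DataFrame({
        "dose":     [0, 2, 10, 50, 0, 2, 10, 50],                   # e.g. dose per animal
        "response": [0.1, 0.4, 1.2, 3.5, 0.2, 0.9, 2.8, 7.1],       # e.g. measured response
        "material": ["TiO2"] * 4 + ["CNT"] * 4,                     # covariate of interest
    })

    # Linear model with dose, material type, and their interaction as predictors.
    model = smf.ols("response ~ dose * material", data=data).fit()
    print(model.summary())

    # A benchmark dose (BMD) could then be estimated by inverting the fitted
    # dose-response curve at a chosen benchmark response (BMR), per material.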
Matthew Kehling - Statistical Cinema, Advisor Dr. Max Buot
Have you ever wondered what determines the length of a movie? Or what rating that movie gets on popular movie rating sites like Rottentomatoes.com or IMDb.com? In this paper, we explore a large movie data set and examine relationships between the length of a movie and various attributes, such as year released, genre, MPAA rating, and budget. We also apply several correlation measures (Pearson, Spearman, and Kendall) to audience ratings from two popular movie rating sites, Rottentomatoes.com and IMDb.com. In addition, we describe the technique of web scraping for obtaining data from online sources, and we implement a bootstrapping approach to estimate the standard errors of the correlation statistics. So, please, sit back, relax, and enjoy the show...
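As a rough illustration of the bootstrap idea mentioned above (the ratings below are fabricated, and the project's actual scraped data and choice of correlation measure may differ), the standard error of a correlation coefficient can be estimated by resampling the paired ratings with replacement:

    # Illustrative sketch: bootstrap standard error of the Spearman correlation
    # between two sets of ratings. The ratings arrays are made-up examples.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    imdb   = np.array([7.1, 8.3, 6.0, 5.4, 9.0, 7.7, 6.8, 8.9])
    rotten = np.array([68, 85, 55, 40, 96, 80, 61, 93])

    boot_corrs = []
    for _ in range(2000):
        idx = rng.integers(0, len(imdb), size=len(imdb))   # resample pairs with replacement
        r, _p = spearmanr(imdb[idx], rotten[idx])
        boot_corrs.append(r)

    print("bootstrap SE of Spearman r:", np.std(boot_corrs, ddof=1))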
Annie McClellan - The Optimal Strategy of Blackjack, Advisor Dr. Richard Pulskamp
The casino game of blackjack is described and outlined in terms of the playing strategies of hitting, standing, splitting, and/or doubling down. We look at "The Optimum Strategy in Blackjack" by Roger Baldwin et al. (1956) and its expectation formulas. The equations give the expectation of the player under the strategy that maximizes the player's gain for each combination of dealer up card and player initial hand. These equations are implemented in Maple assuming an infinite deck of cards, and the linear algebra modeling uses a Markov chain to represent the transitions from one hand to another for both the dealer and the player. The resulting decision rules are compared to Baldwin's, which were constructed assuming a single deck. We find that the simulations of Baldwin's approach largely reproduce his results. Specific discrepancies are accounted for and discussed.
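As a loose illustration of the infinite-deck assumption mentioned above (a simple Monte Carlo sketch in Python, not the Maple and Markov-chain implementation used in the project, and with the dealer assumed to stand on soft 17), the dealer's hand can be simulated by drawing cards with fixed probabilities:

    # Illustrative sketch: simulate the dealer's final total under an
    # infinite-deck assumption (each rank equally likely; 10/J/Q/K all count 10).
    import random

    CARD_VALUES = [2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10, 11]   # 11 = ace

    def dealer_total():
        total, aces = 0, 0
        while total < 17:                     # dealer hits until reaching 17
            card = random.choice(CARD_VALUES)
            total += card
            if card == 11:
                aces += 1
            while total > 21 and aces:        # count an ace as 1 instead of 11
                total -= 10
                aces -= 1
        return total

    results = [dealer_total() for _ in range(100_000)]
    bust_rate = sum(t > 21 for t in results) / len(results)
    print(f"estimated dealer bust probability: {bust_rate:.3f}")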
Margaret O'Toole - Mathematics Anxiety and Pre-service Teachers, Advisor Dr. Carla Gerberry
Mathematics anxiety has been a topic of interest since the 1950s. It has been shown that apprehensive elementary mathematics teachers unintentionally transfer anxiety to the students in their classrooms. In this study we assessed the change in 34 pre-service teachers' anxiety, self-efficacy, and perception of ability during and after a content-specific mathematics course. The study used a mixed methods approach to analyze the data. The results suggest that anxiety decreased over the semester while perception of ability and self-efficacy increased. Levels of confidence with the material were recorded before and after three exams in the course; the pre-service teachers' confidence levels following the exams rose from an average of 2.98 to 4.30 out of 5. We chose three participants who gave different pictures of their feelings toward mathematics to investigate in detail.
Mallory Smarto - Cryptographic Hashing Functions, Advisor Dr. Dena Morton
Cryptographic hash functions are a key aspect of the study of cryptology. They underpin the protection of information and have many real-life applications in current and future security. If two distinct values are "hashed" to the same value, we have a collision. The purpose of this paper is to explore the necessary characteristics of hash functions, gain a fundamental understanding of them, and consider the implications of collisions and how to protect against them and against adversaries. By using a method based on Lenstra's work to develop a collision in X.509 certificates, we examine why such collisions are problematic.
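Collisions in the full-strength hash functions used in X.509 certificates require sophisticated attacks such as the Lenstra-based construction the project follows; as a toy illustration only, the birthday-style search below finds a collision on a deliberately truncated digest, which makes the concept concrete without reproducing the actual attack.

    # Toy illustration: a birthday-style search for a collision on a
    # truncated SHA-256 digest. Real collisions on full-length digests
    # require far more sophisticated techniques; truncation here only
    # makes the idea of a collision visible in a few thousand tries.
    import hashlib

    def short_hash(message: bytes, nbytes: int = 3) -> bytes:
        return hashlib.sha256(message).digest()[:nbytes]   # keep only 24 bits

    seen = {}
    i = 0
    while True:
        msg = f"certificate-field-{i}".encode()   # hypothetical message family
        h = short_hash(msg)
        if h in seen and seen[h] != msg:
            print("collision:", seen[h], msg, "->", h.hex())
            break
        seen[h] = msg
        i += 1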
Caitlin Snyder - The Optimization behind Support Vector Machines and an Application in Handwriting Recognition, Advisors Dr. Minnie Catral and Dr. Liz Johnson
Support Vector Machines (SVMs) are a unique tool for the classification of data and are used to solve real-world problems such as image classification and text analysis. This project explores the mathematical optimization problem underlying the Support Vector Machine algorithm. The original minimization problem is described and an equivalent maximization formulation is derived. Various two- and three-dimensional examples illustrate how the optimization gives a usable result for both linearly and nonlinearly separable data. Finally, the tool is applied to a handwritten character recognition problem. Popular SVM kernels are explored and their respective accuracies in distinguishing between characters are compared.
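As a rough illustration of the kernel comparison described above (this sketch uses scikit-learn's built-in digits dataset rather than the project's handwriting data, and the particular kernels and split are assumptions for the example):

    # Illustrative sketch: compare SVM kernels on scikit-learn's digits dataset.
    # This stands in for, and does not reproduce, the project's experiment.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    for kernel in ["linear", "poly", "rbf"]:        # popular SVM kernels
        clf = SVC(kernel=kernel).fit(X_train, y_train)
        print(kernel, "accuracy:", round(clf.score(X_test, y_test), 3))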