The 2014 Ebola virus disease outbreak in Africa has, over the course of eight months, become one of the deadliest outbreaks in the disease's history. The epidemic has reached catastrophic proportions and has even begun to encroach upon American soil, with a case in Dallas (Haeyoun 2). Although the United States differs from Africa in its capacity to control an outbreak, the fact remains that Ebola is a powerful adversary for which we must devise quick and effective methods of control. To this end, mathematicians such as Dr. Martcheva of the University of Florida have begun to analyze the trends of Ebola with outbreak-style models, and it was through this work that I became interested in pursuing research with Dr. Martcheva. Through a system of ordinary differential equations delineating the rates of change of susceptible, infectious, recovered, and deceased individuals, it becomes possible to create a mathematical model capable of predicting the future of Ebola in Africa. Dr. Martcheva and I started our journey through the intricacies of Ebola with a very common source: the W.H.O. By using the W.H.O. data to graph the progression of Ebola cases in Africa, we were able to identify several potential models for the disease. The three countries suffering most from the Ebola epidemic are Guinea, Liberia, and Sierra Leone. Upon inspection of the data, however, Liberia appears to be suffering the most from an outbreak-style epidemic: the graphs of its cases and deaths are almost exclusively concave up and increasing. What makes this case so interesting to both Dr. Martcheva and me is not only the contemporary topic mixed with the feeling that the disease is at our doorstep.
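The compartmental framework described above can be sketched in code. The following is a minimal illustration, not the model used in the research: it integrates a simple SIRD (susceptible-infectious-recovered-deceased) system with Euler steps, and the parameter values are arbitrary placeholders rather than values fitted to the W.H.O. data.

```python
# A minimal SIRD sketch, assuming standard mass-action equations:
#   dS/dt = -beta*S*I
#   dI/dt =  beta*S*I - (gamma + mu)*I
#   dR/dt =  gamma*I
#   dD/dt =  mu*I
# beta, gamma, and mu are illustrative, NOT fitted to real outbreak data.

def sird_step(s, i, r, d, beta, gamma, mu, dt):
    """Advance the SIRD system by one Euler step of size dt."""
    ds = -beta * s * i
    di = beta * s * i - (gamma + mu) * i
    dr = gamma * i
    dd = mu * i
    return s + ds * dt, i + di * dt, r + dr * dt, d + dd * dt

def simulate(days, s0=0.999, i0=0.001, beta=0.35, gamma=0.07, mu=0.05, dt=0.1):
    """Run the model for the given number of days; fractions of the population."""
    s, i, r, d = s0, i0, 0.0, 0.0
    for _ in range(int(days / dt)):
        s, i, r, d = sird_step(s, i, r, d, beta, gamma, mu, dt)
    return s, i, r, d

if __name__ == "__main__":
    s, i, r, d = simulate(200)
    print(f"S={s:.3f} I={i:.3f} R={r:.3f} D={d:.3f}")
```

Re-running with different values of beta or mu shows how sensitive the predicted death toll is to each parameter, which is precisely the kind of question a fitted model is meant to answer.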
It is also the peculiarity of studying a disease in the midst of its outbreak; as shown in Mathematical Models in Population Biology and Epidemiology by Fred Brauer and Carlos Castillo-Chávez, the diseases usually subjected to epidemic study are those whose outbreaks have already peaked and begun to decline in case counts (346). Scores of mathematicians have discussed outbreaks of epidemic diseases that occurred decades prior, and mathematicians such as Lekone have even analyzed the Ebola outbreaks in the Democratic Republic of the Congo in the 1990s (1170). All of these studies proposed solutions for controlling the Ebola virus, but none anticipated the possibility of Ebola becoming this problematic. In the current outbreak, however, the virus seems only to be getting worse in Liberia, with the concave-up pattern showing little sign of ceasing. Thus the uncertainty in our models rises, and the need for caution at each step increases. So why bother at all? It may appear that the better course would be to wait until the disease has run its course before studying it. However, what mathematical models of Ebola can hopefully uncover is the very thing needed to put a stop to the rampant virus. By assigning parameters to the infectivity between susceptible and infectious individuals, as well as to the rates at which people move from one class to the next (susceptible to infectious, infectious to recovered, and so on), we may vary these values and set initial conditions for the ordinary differential equations so that the outcome matches the data as closely as possible, balancing both uncertainty and the number of parameters.
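The fitting procedure described above, adjusting parameter values until the model output matches the observed case counts, can be illustrated with a toy example. Everything here is hypothetical: the "observed" series is generated synthetically rather than taken from W.H.O. data, and only a single parameter (the transmission rate beta) is searched over a small grid, whereas a real fit would estimate several parameters at once.

```python
# Sketch of least-squares parameter fitting for an outbreak model.
# The SIR equations, parameter values, and "observed" data are all
# illustrative stand-ins, not the actual research model or W.H.O. numbers.

def cumulative_cases(beta, days=10, s0=0.99, i0=0.01, gamma=0.1, dt=0.1):
    """Cumulative fraction ever infected under a simple SIR model, sampled daily."""
    s, i = s0, i0
    samples = []
    steps_per_day = int(1 / dt)
    for step in range(int(days / dt)):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s, i = s + ds * dt, i + di * dt
        if (step + 1) % steps_per_day == 0:
            samples.append(1.0 - s)  # everyone who has left the susceptible class
    return samples

def fit_beta(observed, candidates):
    """Return the candidate beta whose model output has the smallest
    sum of squared errors against the observed series."""
    def sse(beta):
        model = cumulative_cases(beta)
        return sum((m - o) ** 2 for m, o in zip(model, observed))
    return min(candidates, key=sse)

if __name__ == "__main__":
    observed = cumulative_cases(0.4)   # synthetic stand-in for reported case data
    best = fit_beta(observed, [0.2, 0.3, 0.4, 0.5])
    print(best)
```

Because the synthetic data were generated with beta = 0.4, the grid search recovers that value; with real, noisy case counts the fit would instead settle on the value minimizing the residual error.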
Upon finding the best estimate, we shall analyze the reasons for its superiority over the previous models and use them to ascertain the most effective control methods for the virus, as well as the parameters that have the greatest effect on its spread. In doing so, we hope to gain knowledge that will improve our ability to deal not only with the current outbreak but with future outbreaks as well.