Project 2 - Lake Pollution Problem

Modeling is a common analytical tool used to predict outcomes in the real world. It is useful both for helping us avoid harmful episodes before they happen and for finding ways to remedy serious problems after they have occurred. Modeling can save both time and money and can often support our studies in ways that would otherwise be impossible. The case in point is a good example of such a project: a lake is so large that it would be difficult to measure changes until the pollution was well underway, and even then sampling procedures could lead to very different conclusions.

The purpose of this project is to model a lake that is slowly being polluted by a chemical herbicide called WeedBeGone. Our first objective was to set up the formulas to calculate the buildup of herbicide concentration in the lake, using the parameters we had been given. The lake is 100 acres in area and has a river running into it and a river running out of it, which together maintain a constant lake volume. The main source of pollution is a chemical plant, with a small background amount also entering the lake from the river. The basic spreadsheet format was set up for us in Excel, and the flow rates and pollution rates were given. After we calculated the concentration of pollutant in the lake over a period of 360 days, we plotted these points, which showed the beginning of a logarithmic-shaped, saturating curve. For the next step we extended the time frame of pollutant concentration to 1800 days, which, when plotted, displayed a plateau of pollutant concentration at a level of 10^-4. Our next task was to redesign the factory discharge level, using Excel Solver, to produce a maximum WeedBeGone concentration of 10^-5; we accomplished this and plotted the result. Next, we included a rate of degradation of the pollutant in the calculations and found a new, lower maximum level of pollutant concentration. This makes sense, because degradation does not allow the pollutant level to rise as high as it otherwise would. This is also why environmental scientists and government regulators are frequently concerned with the longevity of toxic chemicals and try to find ways to increase the degradation rates of such hazardous substances when their use cannot be dispensed with. Our last task was to redo the project in Matlab, and after some struggle we succeeded; the mass balance underlying both versions is sketched in the code below.

Our group started this project at our PSS on Wednesday, Nov. 29th. We began by working on the problem in parallel at the start of class and then worked together. At the end of PSS we made copies of our file and split up to work on our own until we met again on Friday, Dec. 1st, when we put our individual efforts together and were about 95% done. We finished the project on Dec. 6th. While we were working on the programs, one member worked on the report; a copy of the report was emailed to the other members for review and editing, and we finished the report on Wednesday, Dec. 6th. It was surprising and satisfying to note our progress in working together effectively as a team. On this project we did not wait to get started but jumped in right away and were able to avoid some pitfalls from previous experience. We had really come together to function interactively, as a healthy team should, with a common goal and a desire to see this interesting project through to completion.
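For reference, the following MATLAB sketch shows the kind of day-by-day mass balance that both our spreadsheet and Matlab versions compute. The numeric values (lake depth, river flow, background concentration, plant discharge, and degradation rate) are illustrative assumptions only; the actual rates were supplied in the Excel template and are not reproduced here.

% Day-by-day mass balance for WeedBeGone in the lake (illustrative sketch).
% All parameter values below are assumptions, not the values used in the project.

V     = 100 * 4046.86 * 3;   % lake volume [m^3]: 100 acres times an assumed 3 m mean depth
Q     = 5e4;                 % river flow in = river flow out [m^3/day] (assumed)
C_in  = 1e-6;                % background WeedBeGone concentration in the inflow [kg/m^3] (assumed)
S     = 50;                  % plant discharge of WeedBeGone [kg/day] (assumed)
k     = 0;                   % first-order degradation rate [1/day]; set k > 0 for the degradation case
dt    = 1;                   % time step [days]
ndays = 1800;                % 360 days for the first run, 1800 days to see the plateau

C = zeros(ndays + 1, 1);     % lake concentration [kg/m^3], starting from a clean lake
for t = 1:ndays
    massIn  = S + Q*C_in;            % mass entering per day: plant discharge plus river background
    massOut = Q*C(t) + k*V*C(t);     % mass leaving per day: outflow river plus degradation
    C(t+1)  = C(t) + dt*(massIn - massOut)/V;
end

plot(0:ndays, C)
xlabel('day'), ylabel('WeedBeGone concentration')

Because the outflow (and any degradation) removes mass in proportion to the current concentration, the curve rises quickly at first and then levels off at the plateau we observed.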
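The plateau itself, and the discharge level we asked Solver to find, can also be written down directly from the same balance: at steady state the mass entering each day equals the mass leaving. A minimal sketch, reusing the assumed parameters from the code above:

% Steady-state (plateau) concentration from the mass balance, and the plant
% discharge needed to hold the plateau at a target level (the role Excel Solver
% played in the spreadsheet version). Parameter values are assumed, as above.

C_ss = (S + Q*C_in) / (Q + k*V);           % plateau concentration [kg/m^3]

C_target   = 1e-5;                         % desired maximum concentration
S_required = C_target*(Q + k*V) - Q*C_in;  % plant discharge [kg/day] that yields C_target

With a nonzero degradation rate k, the denominator Q + k*V grows, which is why including degradation lowered the maximum concentration in our runs.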
In our work we used basic Excel formula procedures, graphing, and the Solver function. In Matlab we used basic programming procedures and the plotting function. We are satisfied with our implementation of the project as specified, but if we had time to make this model more useful as a predictor, we would try to add more characteristics of the system's natural functioning. Some of these aspects are the effects of sedimentation, volatility, and bio-uptake on the concentration of WeedBeGone in the lake (sketched briefly at the end of this report). We might also consider the effects of seasonal variations in rainfall, evaporation, and runoff.

The final bill for this project includes:
1.) Matthew Freeborn = $307.20
2.) Steve Freund = $307.20
3.) Lamont Brown = $307.20
4.) Larry Gregus = $307.20
for a total cost of $1,228.80. The breakdown for each member of our group was: labor = 8 hours @ $36/hr = $288, plus $19.20 for the cost of the computer, which equals $307.20 each.

Modeling as a tool for analysis

We began by discussing the procedure of using a computational model to represent the behavior of a system. This system can be a natural one with effects of natural or man-made origin imposed on it, or it can be a man-made system trying to exist within the structure of the real world. Frequently, the system under study is too large to take accurate, meaningful measurements of, or it has such a complicated interrelationship of elements affecting it that even the most minute influence could change the outcome dramatically. A good example of the latter is aircraft design: a man-made system that must harness the forces of nature so that an aircraft of superior qualities results, but in which even the most seemingly insignificant change can make a serious difference. Modeling allows us to study the effects of a variety of factors in a meaningful way, at a more acceptable cost, on a shorter time frame, and without risk to the actual systems themselves. It allows many adjustments to be made quickly, so that poor choices can be dropped while advantageous ones are adopted. Back when the illustrious Wright brothers were trying to get their plane off the ground, each change they made took many hours of work and delay before it could be accurately tested, often at the risk of their lives, and always subject to the whims of the weather. They had to use careful judgement to determine whether an effect was caused by the design or by a caprice of the wind or other natural conditions. Today, we have the tremendous advantage of being able to test innovations in unobtrusive and specific ways, at minimal risk.

In modeling it may sometimes seem that there is a margin of error large enough to make the results meaningless. This is a distinct possibility, so it is very important to design a model for a specific purpose: we must understand what our model can tell us, as well as its limitations. As a case in point, our lake pollution model can give us the overall amount of pollution possible in the lake at any moment in time, though perhaps with a 10-20% margin of error. It can, however, tell us that the pollution level is increasing and what its maximum possible pollutant concentration should be. This model may also indicate whether there are other, unknown sources of pollution emptying into the lake.
As to the limitations of our model, it cannot tell us what the pollutant level is at any particular location in the lake, and sampling may suggest a much lower level of existing pollution. This seeming inconsistency could be due to the settling of the effluent into the layers of sediment. This is precisely what happened with PCBs in Lake Erie: the waters between the U.S. and Canada were considered relatively free of PCBs until a dredging operation opened up the underlying layers of sediment, exposing previous PCB contamination. A model must therefore be designed very carefully to be able to inform us in the ways we intend, while we must be careful not to be fooled into thinking a model can tell us more than its design allows.
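The sedimentation, volatilization, and bio-uptake extensions mentioned in our conclusion could be folded into the same balance as additional first-order loss terms, and tracking the settled mass separately would show how a seemingly clean water column can sit above a contaminated sediment layer, as in the PCB example. A minimal sketch with assumed rate constants (all values illustrative, not measured):

% Extension sketch: extra first-order losses from the water column.
% All values are assumptions, mirroring the illustrative parameters used earlier.

V    = 100 * 4046.86 * 3;   % lake volume [m^3] (100 acres, assumed 3 m mean depth)
Q    = 5e4;                 % river flow [m^3/day] (assumed)
C_in = 1e-6;                % background inflow concentration [kg/m^3] (assumed)
S    = 50;                  % plant discharge [kg/day] (assumed)
k    = 0.01;                % degradation rate [1/day] (assumed)

k_sed = 1e-3;               % settling into the sediment [1/day] (assumed)
k_vol = 5e-4;               % volatilization to the atmosphere [1/day] (assumed)
k_bio = 2e-4;               % uptake by biota [1/day] (assumed)

k_total = k + k_sed + k_vol + k_bio;         % total first-order loss rate
C_water = (S + Q*C_in) / (Q + k_total*V);    % lower plateau concentration in the water column

% Rough estimate of mass accumulating in the sediment over 1800 days at the plateau:
M_sediment = k_sed * V * C_water * 1800;     % [kg] stored in the sediment layer

Each added loss term lowers the plateau in the water column, but the settled mass does not disappear, which is exactly the caution the Lake Erie example illustrates.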