994 results for calculation tool
Abstract:
Illicit drug analyses usually focus on the identification and quantitation of questioned material to support the judicial process. In parallel, more and more laboratories develop physical and chemical profiling methods from a forensic intelligence perspective. The analysis of the large databases resulting from this approach not only enables tactical and operational intelligence to be drawn, but may also contribute to a strategic overview of drug markets. In Western Switzerland, the chemical analysis of illicit drug seizures is centralised in a laboratory hosted by the University of Lausanne. Over more than 8 years, this laboratory has analysed 5875 cocaine and 2728 heroin specimens, coming from 1138 and 614 seizures, respectively, made by police, border guards or customs. Chemical (major and minor alkaloids, purity, cutting agents, chemical class), physical (packaging and appearance) as well as circumstantial (criminal case number, mass of drug seized, date and place of seizure) information is collated in a dedicated database for each specimen. The study capitalises on this extended database and defines several indicators to characterise the structure of drug markets, to follow up on their evolution and to compare the cocaine and heroin markets. Relational, spatial, temporal and quantitative analyses of the data reveal the emergence and importance of distribution networks. They make it possible to evaluate the cross-jurisdictional character of drug trafficking, the observation time of drug batches, and the quantity of drugs entering the market every year. The results highlight the stable nature of drug markets over the years despite the very dynamic flows of distribution and consumption. This research illustrates how the systematic analysis of forensic data may elicit knowledge on criminal activities at a strategic level.
In combination with information from other sources, such knowledge can help to devise intelligence-based preventive and repressive measures and to discuss the impact of countermeasures.
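The relational analysis mentioned above links specimens whose chemical profiles are similar enough to suggest a common batch. As a minimal illustration only (not the laboratory's actual profiling method), pairs of specimens could be linked when the cosine similarity of their normalised alkaloid profiles exceeds a threshold; all specimen IDs, profile values and the threshold below are hypothetical:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two chemical profiles (e.g. alkaloid peak areas)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def link_specimens(profiles, threshold=0.99):
    """Return pairs of specimen ids whose profiles exceed the similarity threshold."""
    ids = list(profiles)
    links = []
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            if cosine_similarity(profiles[ids[i]], profiles[ids[j]]) >= threshold:
                links.append((ids[i], ids[j]))
    return links

# Hypothetical normalised minor-alkaloid profiles for three cocaine specimens
profiles = {
    "S1": [0.52, 0.31, 0.12, 0.05],
    "S2": [0.51, 0.32, 0.12, 0.05],  # chemically close to S1
    "S3": [0.10, 0.60, 0.20, 0.10],
}
links = link_specimens(profiles)
```

In practice such links would feed the relational analysis of seizures, while the threshold would be calibrated against specimens known to come from the same batch.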
Abstract:
COD discharges from processes have increased in line with rising brightness demands for mechanical pulp and papers. The share of lignin-like substances in COD discharges is on average 75%. In this thesis, a plant dynamic model was created and validated as a means to predict COD loading and discharges from a mill. The assays were carried out at an integrated paper mill producing mechanical printing papers. The objective of the plant dynamics modelling was to predict day averages of the COD load and discharges from the mill. This means that online data, such as 1) the levels of the large storage towers for pulp and white water, 2) pulp dosages, 3) production rates and 4) internal white water flows and discharges, were used to create transients in the balances of solids and white water, referred to as "plant dynamics". A conversion coefficient between TOC and COD was verified and used to predict the COD flows to the waste water treatment plant from TOC. The COD load was modelled with an uncertainty similar to that of the reference TOC sampling. The water balance of the waste water treatment was validated against the reference COD concentration. The deviation of the COD predictions from the references was within that of the TOC predictions. The modelled yield losses and retention values of TOC in the pulping and bleaching processes, and the modelled fixing of colloidal TOC to solids between the pulping plant and the aeration basin of the waste water treatment plant, were similar to references presented in the literature. The valid water balances of the waste water treatment plant and the reduction model for lignin-like substances produced a valid prediction of COD discharges from the mill. A 30% increase in the release of lignin-like substances during production problems was observed in the pulping and bleaching processes. The same increase was observed in the COD discharges from waste water treatment.
In the prediction of annual COD discharge, it was noticed that the reduction of lignin has a wide deviation from year to year and from one mill to another. This made it difficult to compare the parameters of COD discharges validated in the plant dynamic simulation with those of another mill producing mechanical printing papers. However, the trend of COD discharges when moving from unbleached towards high-brightness TMP remained valid.
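The verified TOC-to-COD conversion amounts to a proportionality COD ≈ k · TOC. As a sketch of how such a coefficient could be fitted from paired samples (the numbers below are invented, not the mill's data):

```python
def conversion_coefficient(toc, cod):
    """Least-squares slope through the origin for the model COD ≈ k * TOC."""
    return sum(t * c for t, c in zip(toc, cod)) / sum(t * t for t in toc)

# Hypothetical paired daily samples (mg/L)
toc_samples = [10.0, 20.0, 30.0]
cod_samples = [28.0, 57.0, 84.0]

k = conversion_coefficient(toc_samples, cod_samples)
predicted_cod = [k * t for t in toc_samples]  # COD predicted from online TOC
```

Once fitted, the coefficient lets continuously measured TOC stand in for the slower COD reference analysis when predicting loads to the waste water treatment plant.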
Abstract:
The present paper is a review of the basic principles of molecular mechanics, which is the most important tool used in the molecular modeling field, and of its applications to the calculation of the relative stability and chemical reactivity of organometallic and coordination compounds. We show how molecular mechanics can be successfully applied to a wide variety of inorganic systems.
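Molecular mechanics expresses the strain energy of a structure as a sum of simple empirical terms. A schematic sketch of the two simplest contributions, harmonic bond stretching and angle bending, follows; the force constants and reference geometry below are placeholders, not values from any real force field:

```python
import math

def bond_energy(r, r0, k):
    """Harmonic bond-stretch term: k * (r - r0)**2."""
    return k * (r - r0) ** 2

def angle_energy(theta, theta0, k):
    """Harmonic angle-bend term; angles in radians."""
    return k * (theta - theta0) ** 2

def strain_energy(bonds, angles):
    """Total strain from the two simplest terms; torsional and nonbonded
    contributions of a full force field are omitted in this sketch."""
    return sum(bond_energy(*b) for b in bonds) + sum(angle_energy(*a) for a in angles)

# One stretched C-C bond and one widened tetrahedral angle (placeholder constants)
e_strain = strain_energy(
    bonds=[(1.54, 1.52, 300.0)],
    angles=[(math.radians(112.0), math.radians(109.5), 50.0)],
)
```

Comparing such strain energies between candidate geometries or isomers is what underlies relative-stability estimates in molecular mechanics.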
Abstract:
During the last few years, the discussion on the marginal social costs of transportation has been active. Applying the externalities as a tool to control transport would fulfil the polluter-pays principle and simultaneously create a fair control method across the transport modes. This report presents the results of two calculation algorithms developed to estimate the marginal social costs based on the externalities of air pollution. The first algorithm calculates future scenarios of sea transport traffic externalities in the Gulf of Finland until 2015. The second algorithm calculates the externalities of Russian passenger car transit traffic via Finland by taking into account both sea and road transport. The algorithm estimates the ship-originated emissions of carbon dioxide (CO2), nitrogen oxides (NOx), sulphur oxides (SOx) and particulates (PM), and the externalities for each year from 2007 to 2015. The total NOx emissions in the Gulf of Finland from the six ship types were almost 75.7 kilotons (Table 5.2) in 2007. The ship types are: passenger (including cruisers and ROPAX vessels), tanker, general cargo, Ro-Ro, container and bulk vessels. Due to the increase in traffic, the estimate of NOx emissions for 2015 is 112 kilotons. The NOx emission estimate for the whole Baltic Sea shipping is 370 kilotons in 2006 (Stipa et al., 2007). The total marginal social costs due to ship-originated CO2, NOx, SOx and PM emissions in the GOF were calculated at almost 175 million Euros in 2007. The costs will increase to nearly 214 million Euros in 2015 due to the traffic growth. The major part of the externalities is due to CO2 emissions. If we neglect the CO2 emissions by subtracting the CO2 externalities from the results, we get total externalities of 57 million Euros in 2007. After eight years (2015), the externalities would be 28% lower, at 41 million Euros (Table 8.1). This is the result of regulations reducing the sulphur content of marine fuels.
The majority of new car transit goes through Finland to Russia due to the lack of port capacity in Russia. The number of cars was 339 620 vehicles (Statistics of Finnish Customs 2008) in 2005. The externalities are calculated for the transportation of passenger vehicles as follows: by ship to a Finnish port and, after that, by truck to the Russian border checkpoint. The externalities are between 2 and 3 million Euros (year 2000 cost level) for each route. The ports included in the calculations are Hamina, Hanko, Kotka and Turku. With Euro-3 standard trucks, the port of Hanko would be the best choice for transporting the vehicles. This is because of the lower emissions of new trucks and the shortened ship transport distance. If the trucks are more polluting Euro-1 level trucks, the port of Kotka would be the best choice. This indicates that truck emissions have a considerable effect on the externalities and that the transportation of light cargo, such as passenger cars, by ship produces considerably high emission externalities. The emission externalities approach offers a new insight for valuing the multiple traffic modes. However, the calculation of marginal social costs based on air emission externalities should not be regarded as a ready-made calculation system. The system is clearly in need of improvement, but it can already be considered a potential tool for political decision making.
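The core of both calculation algorithms is valuing each pollutant's mass with a unit externality cost and summing. A minimal sketch of that step; the unit costs and emission figures below are illustrative placeholders, not the report's validated values:

```python
# Illustrative unit externality costs (EUR per tonne), not the report's figures
UNIT_COST_EUR_PER_TONNE = {"CO2": 20.0, "NOx": 4000.0, "SOx": 3000.0, "PM": 10000.0}

def externality_cost(emissions_tonnes, unit_cost=UNIT_COST_EUR_PER_TONNE):
    """Marginal social cost as the sum of pollutant masses times unit costs."""
    return sum(unit_cost[p] * mass for p, mass in emissions_tonnes.items())

# Hypothetical annual emissions of one ship type, in tonnes
cost = externality_cost({"CO2": 1000.0, "NOx": 10.0, "SOx": 5.0, "PM": 1.0})
```

Running the same sum with and without the CO2 entry reproduces the kind of "with/without CO2 externalities" comparison reported above.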
Abstract:
Public opinion surveys have become progressively incorporated into systems of official statistics. Surveys of the economic climate are usually qualitative because they collect the opinions of businesspeople and/or experts about long-term indicators described by a number of variables. In such cases the responses are expressed in ordinal terms; that is, the respondents verbally report, for example, whether during a given quarter the sales or the new orders have increased, decreased or remained the same as in the previous quarter. These data allow the calculation of the percentage of respondents in the total population (results are extrapolated) who select each of the three options. Data are often presented in the form of an index calculated as the difference between the percentage of those who claim that a given variable has improved and the percentage of those who claim that it has deteriorated.
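The index described above (often called a balance or diffusion index) can be computed directly from the three response counts. A minimal sketch, with invented counts:

```python
def balance_index(increased, unchanged, decreased):
    """Balance index: percentage reporting improvement minus percentage
    reporting deterioration; 'unchanged' enters only via the total."""
    total = increased + unchanged + decreased
    return 100.0 * (increased - decreased) / total

# Hypothetical quarterly responses: 120 "increased", 50 "same", 30 "decreased"
index = balance_index(120, 50, 30)
```

The index ranges from -100 (everyone reports deterioration) to +100 (everyone reports improvement), with 0 meaning the two groups balance out.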
Abstract:
Nonnative brook trout Salvelinus fontinalis are abundant in Pine Creek and its main tributary, Bogard Spring Creek, California. These creeks historically provided the most spawning and rearing habitat for endemic Eagle Lake rainbow trout Oncorhynchus mykiss aquilarum. Three-pass electrofishing removal was conducted in 2007–2009 over the entire 2.8-km length of Bogard Spring Creek to determine whether brook trout removal was a feasible restoration tool and to document the life history characteristics of brook trout in a California meadow stream. After the first 2 years of removal, brook trout density and biomass were severely reduced from 15,803 to 1,192 fish/ha and from 277 to 31 kg/ha, respectively. Average removal efficiency was 92–97%, and most of the remaining fish were removed in the third year. The lack of a decrease in age-0 brook trout abundance between 2007 and 2008 after the removal of more than 4,000 adults in 2007 suggests compensatory reproduction of mature fish that survived and higher survival of age-0 fish. However, recruitment was greatly reduced after 2 years of removal and is likely to be even more depressed after the third year of removal, assuming that immigration of fish from outside the creek continues to be minimal. Brook trout condition, growth, and fecundity indicated a stunted population at the start of the study, but all three features increased significantly every year, demonstrating compensatory effects. Although highly labor intensive, the use of electrofishing to eradicate brook trout may be feasible in Bogard Spring Creek and similar small streams if removal and monitoring are continued annually and if other control measures (e.g., construction of barriers) are implemented. Our evidence shows that if brook trout control measures continue and if only Eagle Lake rainbow trout are allowed access to the creek, then a self-sustaining population of Eagle Lake rainbow trout can become reestablished.
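Multi-pass removal data like those above are commonly used to estimate the starting population and the per-pass capture probability with a constant-effort depletion (Zippin-type) model. The study does not publish its estimator, so the sketch below, with invented catch numbers, is only an illustration of the idea, using a simple grid search over the capture probability:

```python
def depletion_estimate(catches):
    """Estimate population size N and capture probability p from multi-pass
    removal catches, assuming constant effort per pass (Zippin-type model).
    A grid search over p; N is then implied by the total catch."""
    k = len(catches)
    total = sum(catches)
    best_sse, best_n, best_p = None, None, None
    for step in range(1, 100):
        p = step / 100.0
        n_hat = total / (1.0 - (1.0 - p) ** k)
        # Expected catch on pass i: survivors so far times capture probability
        expected = [n_hat * p * (1.0 - p) ** i for i in range(k)]
        sse = sum((c - e) ** 2 for c, e in zip(catches, expected))
        if best_sse is None or sse < best_sse:
            best_sse, best_n, best_p = sse, n_hat, p
    return round(best_n), best_p

# Hypothetical three-pass catches for one stream section
n_est, p_est = depletion_estimate([300, 90, 27])
```

With these invented catches the declining pattern is consistent with roughly 70% of the remaining fish being caught on each pass, leaving a small estimated residual after the third pass.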
Abstract:
Among increasingly used pharmaceutical products, β-blockers have been commonly reported at low concentrations in rivers and littoral waters of Europe and North America. Little is known about the toxicity of these chemicals in freshwater ecosystems, although their presence may lead to chronic pollution. Hence, in this study the acute toxicity of three β-blockers (metoprolol, propranolol and atenolol) on fluvial biofilms was assessed by using several biomarkers. Some were indicative of potential alterations in biofilm algae (photosynthetic efficiency), and others in biofilm bacteria (peptidase activity, bacterial mortality). Propranolol was the most toxic β-blocker, mostly affecting the algal photosynthetic process. Exposure to 531 μg/L of propranolol caused 85% inhibition of photosynthesis after 24 h. Metoprolol was particularly toxic for bacteria. Though the estimated No-Effect Concentrations (NEC) were similar to environmental concentrations, higher concentrations of the toxicant (503 μg/L of metoprolol) caused a 50% increase in bacterial mortality. Atenolol was the least toxic of the three tested β-blockers. Effects greater than 50% were observed only at a very high concentration (707 mg/L). The higher toxicity of metoprolol and propranolol might be due to better absorption of these two chemicals within biofilms. Since β-blockers are mainly found in mixtures in rivers, their differential toxicity could have potentially relevant consequences for the interactions between algae and bacteria within river biofilms.
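Effect figures such as the 85% inhibition of photosynthesis are computed relative to an unexposed control. A minimal sketch of that calculation; the yield readings below are invented, chosen only so the example reproduces an 85% effect:

```python
def percent_inhibition(control, treated):
    """Percent effect of an exposure relative to an unexposed control
    measurement (e.g. photosynthetic yield of a biofilm)."""
    return 100.0 * (control - treated) / control

# Hypothetical photosynthetic-yield readings: control vs. propranolol-exposed
effect = percent_inhibition(0.62, 0.093)
```

The same formula applies to any of the biomarkers as long as larger control values mean a healthier biofilm; for mortality-type endpoints the sign of the comparison is reversed.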
Abstract:
One of the techniques used to detect faults in dynamic systems is analytical redundancy. An important difficulty in applying this technique to real systems is dealing with the uncertainties associated with the system itself and with the measurements. In this paper, this uncertainty is taken into account by the use of intervals for the parameters of the model and for the measurements. The method proposed in this paper checks the consistency between the system's behavior, obtained from the measurements, and the model's behavior; if they are inconsistent, then there is a fault. The problem of detecting faults is stated as a quantified real constraint satisfaction problem, which can be solved using modal interval analysis (MIA). MIA is used because it provides powerful tools to extend calculations over real functions to intervals. To improve the results of fault detection, the simultaneous use of several sliding time windows is proposed. The result of implementing this method is semiqualitative tracking (SQualTrack), a fault-detection tool that is robust in the sense that it does not generate false alarms; that is, if there are false alarms, they indicate either that the interval model does not represent the system adequately or that the interval measurements do not represent the true values of the variables adequately. SQualTrack is currently being used to detect faults in real processes. Some of these applications using real data have been developed within the European project advanced decision support system for chemical/petrochemical manufacturing processes and are also described in this paper.
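The consistency test at the heart of this approach can be illustrated with ordinary interval arithmetic on a toy model (the paper itself uses the richer machinery of modal interval analysis). In the sketch below the model, its parameter interval and the measurement intervals are all hypothetical: a fault is flagged when the measured interval and the model's predicted interval are disjoint.

```python
def predicted_interval(a_lo, a_hi, u):
    """Output interval of the toy model y = a * u for a in [a_lo, a_hi], u >= 0."""
    return (a_lo * u, a_hi * u)

def is_fault(measured, a_lo, a_hi, u):
    """Fault indicated when the measured interval and the predicted
    interval for the same instant have an empty intersection."""
    pred_lo, pred_hi = predicted_interval(a_lo, a_hi, u)
    m_lo, m_hi = measured
    return m_hi < pred_lo or m_lo > pred_hi

# Hypothetical gain interval [1.9, 2.1], input u = 10 -> prediction [19, 21]
consistent = is_fault((20.0, 20.5), 1.9, 2.1, 10.0)   # overlaps -> no fault
faulty = is_fault((25.0, 26.0), 1.9, 2.1, 10.0)       # disjoint -> fault
```

The robustness property claimed in the abstract corresponds to this logic: as long as the parameter and measurement intervals genuinely contain the true values, a consistent (overlapping) pair can never be reported as a fault.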
Abstract:
Customer profitability accounting is a well-researched topic in the academic field, and it has been proven to possess rather indisputable benefits. However, the calculation of customer profitabilities can be challenging, and therefore the use of such accounting is not self-evident in organizations. The aim of this study was to create a customer profitability accounting model for a wholesale unit in the case company to function as a sales management tool. The literature review of the study presents certain fundamental issues related to customer profitability accounting; in addition, a theoretical framework for accounting model design is provided. The creation of the model commenced by setting the requirements for it and examining the foundation of the model design, which consisted of, for instance, the price setting and cost structure of products. This was followed by selecting approaches to the creation of the model. The result of the study was an accounting model, for which the included revenues and costs were determined, along with the formulation of allocation criteria for the costs. Lastly, the customer profitabilities were calculated in accordance with the accounting principles and the calculation logic of the model. The attained figures proved that the model provides an appropriate solution for obtaining customer profitabilities and thus for using the accounting information as a sales management tool in, for instance, decision making and negotiation situations.
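The calculation logic of such a model reduces to revenue minus direct costs minus costs allocated by some driver. A minimal sketch with one activity-based driver; the customers, figures and the driver (orders handled) are all hypothetical, not the case company's data:

```python
def customer_profitability(revenue, direct_cost, activities, cost_per_activity):
    """Customer result = revenue - direct costs - activity-based allocation."""
    return {
        customer: revenue[customer]
        - direct_cost[customer]
        - activities[customer] * cost_per_activity
        for customer in revenue
    }

# Hypothetical figures: two customers, order handling as the cost driver
result = customer_profitability(
    revenue={"A": 1000.0, "B": 800.0},
    direct_cost={"A": 600.0, "B": 500.0},
    activities={"A": 10, "B": 40},   # orders handled per customer
    cost_per_activity=5.0,           # EUR allocated per order
)
```

The example shows why the allocation criteria matter: customer B looks comparable to A on margin alone, but a high activity count erodes its profitability once indirect costs are allocated.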
Abstract:
In this work we present the formulas for the calculation of exact three-center electron sharing indices (3c-ESI) and introduce two new approximate expressions for correlated wave functions. The 3c-ESI uses the third-order density, the diagonal of the third-order reduced density matrix, but the approximations suggested in this work involve only natural orbitals and occupancies. In addition, the first calculations of the 3c-ESI using Valdemoro's, Nakatsuji's and Mazziotti's approximations for the third-order reduced density matrix are also presented for comparison. Our results on a test set of molecules, including 32 3c-ESI values, prove that the new approximation based on the cubic root of natural occupancies performs best, yielding absolute errors below 0.07 and an average absolute error of 0.015. Furthermore, this approximation seems to be rather insensitive to the amount of electron correlation present in the system. This newly developed methodology provides a computationally inexpensive method to calculate the 3c-ESI from correlated wave functions and opens new avenues to approximate high-order reduced density matrices in other contexts, such as the contracted Schrödinger equation and the anti-Hermitian contracted Schrödinger equation.
Abstract:
To predict the capacity of a structure, or the point beyond which it becomes unstable, calculation of the critical crack size is important. Structures usually contain several cracks, but not all of these cracks necessarily lead to failure or reach the critical size. Thus, identifying the harmful cracks, or the crack size most likely to lead to failure, provides criteria for the structure's capacity at elevated temperature. The scope of this thesis was to calculate fracture parameters such as the stress intensity factor, the J integral, and the plastic and ultimate capacity of the structure in order to estimate the critical crack size for this specific structure. Several three-dimensional (3D) simulations using the finite element method (the Ansys program) and the boundary element method (the Frank 3D program) were carried out to calculate the fracture parameters, and the results, with the aid of laboratory tests (load-displacement curve, the J resistance curve, and yield or ultimate stress), led to the extraction of the critical crack size. Two types of fracture that are usually affected by temperature, elastic and elastic-plastic, were simulated by performing several linear elastic and nonlinear elastic analyses. Geometry details of the weldment, the flank angle and toe radius, were also studied independently to estimate the location of crack initiation and to simulate the stress field in the early stages of crack extension in the structure. An overview of the structure's capacity at room temperature (20 ºC) was also studied in this work. Comparison of the results at different temperatures (20 ºC and -40 ºC) provides a threshold of the structure's behavior within the defined range.
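For the purely elastic case, the relation between the stress intensity factor and the critical crack size has a well-known closed form: failure is predicted when K = Y·σ·√(πa) reaches the fracture toughness K_IC, giving a_c = (1/π)·(K_IC/(Y·σ))². The thesis extracts its critical sizes from 3D finite/boundary element analyses and tests, so the sketch below is only the textbook LEFM estimate, with placeholder material and load values:

```python
import math

def critical_crack_size(k_ic, stress, geometry_factor=1.0):
    """Closed-form LEFM estimate a_c = (1/pi) * (K_IC / (Y * sigma))**2.
    k_ic in MPa*sqrt(m), stress in MPa; returns the crack size in metres."""
    return (k_ic / (geometry_factor * stress)) ** 2 / math.pi

# Placeholder values: K_IC = 50 MPa*sqrt(m), applied stress 200 MPa, Y = 1
a_c = critical_crack_size(50.0, 200.0)   # metres
```

Lower toughness at low temperature (smaller K_IC) or a stress-raising geometry (larger Y) shrinks a_c, which is why the weld geometry and the -40 ºC case receive separate attention above.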
Abstract:
The present article comes from a doctoral thesis on the digital learner portfolio, an innovative methodology from the perspective of the European Higher Education Area. First, the educational concept of the eportfolio is described in terms of its procedure and structure, by means of the technological support of a virtual campus platform. Second, the pedagogical model of an eportfolio is shown, which adapts subjects of an instrumental character to an organization based on tasks and reflections. This design of a virtual learning environment is based on a teaching-learning methodology grounded in the activity of the student, which aims to support the management of his or her own process of learning and assessment. Finally, the article illustrates the experience of implementing the first digital learner portfolios at the University of Barcelona and the Autonomous University of Barcelona, with the objective of reflecting on the pedagogical consequences that this assessment model with technological support has in a traditional higher education institution.
Abstract:
The aim of this project is to develop a Matlab-based software application to calculate the radioelectrical coverage by surface wave of broadcast radio stations in the Medium Wave (MW) band all around the world. Also, given the locations of a transmitting and a receiving station, the software should be able to calculate the electric field that the receiver should receive at that specific site. In the case of several transmitters, the program should check for the existence of inter-symbol interference and calculate the field strength accordingly. The application should ask for the configuration parameters of the transmitting radio station within a Graphical User Interface (GUI), and display the resulting coverage over a map of the area under study. For the development of this project, several conductivity databases of different countries have been used, as well as a high-resolution elevation database (GLOBE). Also, to calculate the field strength due to groundwave propagation, the ITU GRWAVE program has been used, which had to be integrated into a Matlab interface to be usable by the application developed.
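Ground-wave field strength over real ground requires a propagation model such as GRWAVE, but the free-space field E = √(30·P·G)/d provides the standard upper-bound baseline against which such predictions are compared. A minimal sketch of that baseline only (the transmitter parameters below are hypothetical, and this deliberately ignores ground conductivity and terrain):

```python
import math

def free_space_field_dbuv(power_w, gain, distance_m):
    """Free-space field strength E = sqrt(30 * P * G) / d in V/m,
    returned in dBuV/m. A baseline only: actual MW surface-wave coverage
    needs a ground-wave model (e.g. ITU GRWAVE) over real conductivity data."""
    e_v_per_m = math.sqrt(30.0 * power_w * gain) / distance_m
    return 20.0 * math.log10(e_v_per_m * 1e6)

# Hypothetical transmitter: 100 kW, unity gain, receiver at 10 km
level = free_space_field_dbuv(100e3, 1.0, 10e3)
```

Real surface-wave levels fall below this baseline with distance and with decreasing ground conductivity, which is exactly the attenuation the conductivity databases and GRWAVE supply.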
Abstract:
The present thesis is focused on minimizing the experimental effort needed for the prediction of pollutant propagation in rivers by means of mathematical modelling and knowledge re-use. The mathematical modelling is based on the well-known advection-dispersion equation, while the knowledge re-use approach employs the methods of case-based reasoning, graphical analysis and text mining. The thesis contributes to the pollutant transport research field with: (1) analytical and numerical models for pollutant transport prediction; (2) two novel techniques which enable the use of variable parameters along rivers in analytical models; (3) models for the estimation of the characteristic parameters of pollutant transport (velocity, dispersion coefficient and nutrient transformation rates) as functions of water flow, channel characteristics and/or seasonality; (4) a graphical analysis method to be used for the identification of pollution sources along rivers; (5) a case-based reasoning tool for the identification of crucial information related to pollutant transport modelling; (6) and the application of a software tool for the re-use of information during pollutant transport modelling research. These support tools are applicable both in the water quality research field and in practice, as they can be involved in multiple activities. The models are capable of predicting pollutant propagation along rivers in cases of both ordinary pollution and accidents. They can also be applied to other, similar rivers for modelling pollutant transport when experimental concentration data are scarce, because the models for parameter estimation developed in the present thesis enable the calculation of the characteristic transport parameters as functions of river hydraulic parameters and/or seasonality.
The similarity between rivers is assessed using case-based reasoning tools, and additional necessary information can be identified by using the software for information re-use. Such systems represent support for users and open up possibilities for new modelling methods, monitoring facilities and better river water quality management tools. They are also useful for the estimation of the environmental impact of possible technological changes and can be applied at the pre-design stage and/or in the practical use of processes.
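For a constant velocity u and dispersion coefficient D, the advection-dispersion equation underlying these models has the classical analytical solution for an instantaneous point release, C(x, t) = M/(A·√(4πDt)) · exp(-(x - ut)²/(4Dt)). A minimal sketch of that solution; the spill mass, channel geometry and parameter values below are invented, and the thesis's variable-parameter techniques are not reproduced here:

```python
import math

def advection_dispersion_c(x, t, mass, area, velocity, dispersion):
    """Analytical 1-D advection-dispersion solution for an instantaneous
    point release into a channel of constant cross-section:
        C(x, t) = M / (A * sqrt(4*pi*D*t)) * exp(-(x - u*t)**2 / (4*D*t))"""
    spread = math.sqrt(4.0 * math.pi * dispersion * t)
    return (mass / (area * spread)) * math.exp(
        -((x - velocity * t) ** 2) / (4.0 * dispersion * t)
    )

# Hypothetical spill: 100 g into a 20 m^2 channel, u = 0.5 m/s, D = 5 m^2/s
peak = advection_dispersion_c(x=1800.0, t=3600.0, mass=100.0, area=20.0,
                              velocity=0.5, dispersion=5.0)
```

After one hour the concentration peak sits at x = u·t = 1800 m, and its height decays as 1/√t while the plume spreads, which is the behaviour used when predicting accidental pollution downstream.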
Abstract:
The purpose of this thesis was to define how product carbon footprint analysis and its results can be used in a company's internal development as well as in customer and interest group guidance, and how these factors are related to corporate social responsibility. A cradle-to-gate carbon footprint was calculated for three products: Torino Whole grain barley, Torino Pearl barley, and Elovena Barley grit & oat bran, all of them made of Finnish barley. The carbon footprint of the Elovena product was used to determine the carbon footprints of porridge portions cooked in an industrial kitchen. The basic calculation data was collected from several sources. Most of the data originated from Raisio Group's contractual farmers and Raisio Group's cultivation, processing and packaging specialists. Data from national and European literature and database sources was also used. The electricity consumption for the porridge portions' carbon footprint calculations was determined with practical measurements. The carbon footprint calculations were conducted according to the ISO 14044 standard, and the PAS 2050 guide was also applied. A consequential functional unit was applied in the porridge portions' carbon footprint calculations. Most of the emissions from the barley products' life cycle originate from primary production. The nitrous oxide emissions from cultivated soil and the use and production of nitrogenous fertilisers contribute over 50% of the products' carbon footprint. Torino Pearl barley has the highest carbon footprint due to its lowest processing output. Reductions in the products' carbon footprints can be achieved through developments in cultivation and grain processing. The carbon footprint of a porridge portion can be reduced by using domestically produced plant-based ingredients and by making the best possible use of the kettle. Carbon footprint calculation can be used to identify possible improvement points related to corporate environmental responsibility.
Several improvement actions are related to economic and social responsibility through better raw material utilization and expense reductions.
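At its core, a cradle-to-gate footprint of the kind described above is an inventory sum: for each life-cycle stage, the activity amount multiplied by an emission factor, aggregated in kg CO2e. A minimal sketch of that aggregation; the stages, amounts and emission factors below are illustrative placeholders, not the thesis's validated values:

```python
# Illustrative emission factors (kg CO2e per kg handled at each stage),
# not the thesis's validated figures
EMISSION_FACTORS = {"cultivation": 0.60, "processing": 0.20, "packaging": 0.50}

def carbon_footprint(amounts_kg, factors=EMISSION_FACTORS):
    """Cradle-to-gate footprint: sum over stages of amount * emission factor."""
    return sum(amount * factors[stage] for stage, amount in amounts_kg.items())

# Hypothetical stage inputs (kg) behind one kg of finished product
footprint = carbon_footprint({"cultivation": 1.0, "processing": 0.3, "packaging": 0.1})
```

With factors weighted like these, the cultivation stage dominates the total, mirroring the finding above that primary production (soil N2O and fertiliser manufacture) contributes most of the products' footprint.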