928 results for modeling and visualization


Relevance:

90.00%

Publisher:

Abstract:

Queueing systems constitute a central tool in modeling and performance analysis. Such systems appear in our everyday activities, and the theory of queueing systems was developed to provide models for forecasting the behavior of systems subject to random demand. The practical and useful applications of discrete-time queues motivate researchers to continue analyzing models of this type. The present contribution concerns a discrete-time Geo/G/1 queue in which some messages may need a second service time in addition to the first, essential service. Day-to-day life offers numerous queueing situations, for example in manufacturing processes, telecommunication, and home automation, but in this paper the particular application is video surveillance with intrusion recognition, where all arriving messages require the main service and only some may require the subsidiary service, provided by the server under different types of strategies. We carry out a thorough study of the model, deriving analytical results for the stationary distribution. The generating functions of the number of messages in the queue and in the system are obtained. The generating functions of the busy period, as well as of the sojourn times of a message in the server, the queue, and the system, are also provided.
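The results above are analytical (generating functions); as a complementary illustration of the model's dynamics, the queue can also be simulated slot by slot. The sketch below is not the paper's Geo/G/1 analysis: it assumes geometric (rather than general) service times, and the arrival probability, service probability, and second-service probability are illustrative values chosen for the example.

```python
import random

def geom(rng, p):
    """Geometric(p) service time on {1, 2, ...}."""
    n = 1
    while rng.random() >= p:
        n += 1
    return n

def mean_in_system(p_arrival=0.2, p_service=0.5, p_second=0.3,
                   slots=100_000, seed=1):
    """Time-average number of messages in the system (queue + server)
    for a discrete-time queue with Bernoulli arrivals, geometric service,
    and an optional second service required with probability p_second."""
    rng = random.Random(seed)
    queue, remaining, area = 0, 0, 0
    for _ in range(slots):
        area += queue + (1 if remaining > 0 else 0)
        if rng.random() < p_arrival:        # Bernoulli arrival this slot
            queue += 1
        if remaining > 0:                   # one slot of service elapses
            remaining -= 1
        if remaining == 0 and queue > 0:    # server takes the next message
            queue -= 1
            remaining = geom(rng, p_service)
            if rng.random() < p_second:     # subsidiary (second) service
                remaining += geom(rng, p_service)
    return area / slots
```

With these parameters the load is p_arrival * (1 + p_second) / p_service = 0.52 < 1, so the simulated queue is stable and the time average converges.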

Relevance:

90.00%

Publisher:

Abstract:

Optical waveguides have shown promising results for use within printed circuit boards. These optical waveguides have higher bandwidth than traditional copper transmission systems and are immune to electromagnetic interference. Design parameters for these optical waveguides are needed to ensure an optimal link budget. Modeling and simulation methods are used to determine the optimal design parameters for the waveguides. As a result, the optical structures necessary for incorporating optical waveguides into printed circuit boards are designed and optimized. Embedded siloxane polymer waveguides are investigated for use in optical printed circuit boards. This material was chosen because it has low absorption and high temperature stability, and because it can be deposited using common processing techniques. Two sizes of waveguides are investigated: 50 µm multimode and 4-9 µm single mode waveguides. A beam propagation method is developed for simulating the multimode and single mode waveguide parameters. The attenuation of the simulated multimode waveguides matches the attenuation of fabricated waveguides with a root mean square error of 0.192 dB. Using the same process as for the multimode waveguides, the parameters needed to ensure a low link loss are found for single mode waveguides, including maximum size, minimum cladding thickness, minimum waveguide separation, and minimum bend radius. To couple light out of plane to a transmitter or receiver, a structure such as a vertical interconnect assembly (VIA) is required. For multimode waveguides, the optimal placement of a total internal reflection mirror can be found without prior knowledge of the waveguide length. The optimal placement is either 60 µm or 150 µm away from the end of the waveguide, depending on which metric a designer wants to optimize: the average output power, the output power variance, or the maximum possible power loss.
For single mode waveguides, a volume grating coupler is designed to couple light from a silicon waveguide to a polymer single mode waveguide. A focusing grating coupler is compared to a perpendicular grating coupler that is focused by a micro-molded lens. The focusing grating coupler had an optical loss of more than 14 dB, while the grating coupler with a lens had an optical loss of 6.26 dB.
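Since the couplers above are compared through their optical losses, it may help to recall how a link budget combines them: losses expressed in dB simply add, and the received power is the launched power minus the total loss. The sketch below uses made-up values (the propagation loss and mirror loss are assumptions, not numbers from the thesis).

```python
def received_power_dbm(p_in_dbm, losses_db):
    """Received power after a chain of lossy elements: dB losses add."""
    return p_in_dbm - sum(losses_db)

# Hypothetical single mode link: grating coupler with lens (6.26 dB),
# 10 cm of polymer waveguide at an assumed 0.05 dB/cm, one mirror (1.5 dB)
p_rx = received_power_dbm(0.0, [6.26, 0.05 * 10, 1.5])
```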

Relevance:

90.00%

Publisher:

Abstract:

Tropospheric ozone (O3) and carbon monoxide (CO) pollution in the Northern Hemisphere is commonly thought to be of anthropogenic origin. While this is true in most cases, copious quantities of pollutants are emitted by fires in boreal regions, and the impact of these fires on CO has been shown to significantly exceed the impact of urban and industrial sources during large fire years. The impact of boreal fires on ozone is still poorly quantified, and large uncertainties exist in the estimates of fire-released nitrogen oxides (NOx), a critical factor in ozone production. As boreal fire activity is predicted to increase in the future due to its strong dependence on weather conditions, it is necessary to understand how these fires affect atmospheric composition. To determine the scale of boreal fire impacts on ozone and its precursors, this work combined statistical analysis of ground-based measurements downwind of fires, satellite data analysis, transport modeling and the results of chemical model simulations. The first part of this work focused on determining the boreal fire impact on ozone levels downwind of fires, using analysis of observations in several-days-old fire plumes intercepted at the Pico Mountain station (Azores). The results of this study revealed that fires significantly increase the midlatitude summertime ozone background during high fire years, implying that predicted future increases in boreal wildfires may affect ozone levels over large regions in the Northern Hemisphere. To improve current estimates of NOx emissions from boreal fires, we further analyzed ΔNOy/ΔCO enhancement ratios in the observed fire plumes together with transport modeling of fire emission estimates. The results of this analysis revealed the presence of a considerable seasonal trend in the fire NOx/CO emission ratio due to late-summer changes in burning properties.
This finding implies that the constant NOx/CO emission ratio currently used in atmospheric modeling is unrealistic, and is likely to introduce a significant bias in the estimated ozone production. Finally, satellite observations were used to determine the impact of fires on atmospheric burdens of nitrogen dioxide (NO2) and formaldehyde (HCHO) in the North American boreal region. This analysis demonstrated that fires dominated the HCHO burden over the fires and in plumes up to two days old. This finding provides insights into the magnitude of secondary HCHO production and further enhances scientific understanding of the atmospheric impacts of boreal fires.

Relevance:

90.00%

Publisher:

Abstract:

Carbon monoxide (CO) and ozone (O3) are among the most important atmospheric pollutants in the troposphere, and both have significant effects on human health; both are included in the U.S. EPA list of criteria pollutants. CO is primarily emitted in the source region, whereas O3 can be formed near the source, during transport of the pollution plumes containing O3 precursors, or in a receptor region as the plumes subside. The long chemical lifetimes of both CO and O3 enable them to be transported over long distances. This transport is important on continental scales as well, where it is commonly referred to as intercontinental transport; it affects the concentrations of both CO and O3 in downwind receptor regions, with significant implications for air quality standards. Over the period 2001-2011, anthropogenic emissions of CO and NOx decreased in North America and Europe, whereas emissions over Asia increased. How these emission trends have affected concentrations at remote sites located downwind of these continents is an important question. The PICO-NARE observatory, located on Pico Mountain in the Azores, Portugal, is frequently impacted by North American pollution outflow (both anthropogenic and biomass burning) and is a unique site for investigating long-range transport from North America. This study uses in-situ observations of CO and O3 for the period 2001-2011 at PICO-NARE, coupled with output from full chemistry simulations (with normal and with fixed anthropogenic emissions) and tagged CO simulations in GEOS-Chem, a global 3-D chemical transport model of atmospheric composition driven by meteorological input from the Goddard Earth Observing System (GEOS) of the NASA Global Modeling and Assimilation Office, to determine and interpret the trends in CO and O3 concentrations over the past decade.
These trends are useful for ascertaining the impacts that emission reductions in the United States have had over Pico and, more generally, over the North Atlantic. A regression model with sinusoidal functions and a linear trend term was fit to the in-situ observations and to the GEOS-Chem output for CO and O3 at Pico. The regression model yielded decreasing trends for CO and O3 both in the observations (-0.314 ppbv/year and -0.208 ppbv/year, respectively) and in the full chemistry simulation with normal emissions (-0.343 ppbv/year and -0.526 ppbv/year, respectively). Based on analysis of the results from the full chemistry simulation with fixed anthropogenic emissions and the tagged CO simulation, it was concluded that the decreasing trends in CO were a consequence of anthropogenic emission changes in regions such as the USA and Asia. The emission reductions in the USA are countered by Asian increases, but the former have a greater impact, resulting in decreasing trends for CO at PICO-NARE. For O3, however, it is the increase in water vapor content (which increases O3 destruction) along the transport pathways from North America to PICO-NARE, as well as around the site, that has resulted in decreasing trends over this period. This decrease is partially offset by an increase in O3 concentrations due to anthropogenic influence, which could be due to increasing Asian emissions of O3 precursors, as these emissions have decreased over the US. However, the anthropogenic influence does not change the direction of the trend. It can thus be concluded that CO and O3 concentrations at PICO-NARE decreased over 2001-2011.
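The trend estimation described above, a regression with sinusoidal terms for the seasonal cycle plus a linear trend, can be sketched with ordinary least squares. The exact functional form used in the study is not given here; the sketch below assumes a single annual harmonic and fits it to synthetic data with a built-in -0.3 ppbv/year trend.

```python
import numpy as np

def fit_harmonic_trend(t_years, y):
    """Least-squares fit of y ~ a + b*t + c*sin(2*pi*t) + d*cos(2*pi*t);
    b is the linear trend in units of y per year."""
    X = np.column_stack([
        np.ones_like(t_years),
        t_years,
        np.sin(2 * np.pi * t_years),
        np.cos(2 * np.pi * t_years),
    ])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # [intercept, trend, sin amplitude, cos amplitude]

# Synthetic check: seasonal cycle plus a -0.3 per-year trend
t = np.arange(0, 10, 1 / 52)                   # weekly samples, 10 years
y = 120 - 0.3 * t + 5 * np.sin(2 * np.pi * t)  # made-up CO-like series
coef = fit_harmonic_trend(t, y)                # coef[1] recovers -0.3
```

On real, noisy observations the recovered trend would of course carry an uncertainty, which is why the study compares observed and simulated trends rather than relying on a single fit.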

Relevance:

90.00%

Publisher:

Abstract:

Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate one or more lanes adjacent to a freeway that provide congestion-free trips to eligible users, such as transit vehicles or toll-payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among the different approaches for predicting this demand, the four-step demand forecasting process is the most common. Managed lane demand is usually estimated at the assignment step; therefore, the key to reliably estimating the demand is the use of effective assignment modeling processes. Managed lanes are particularly effective when the road is functioning at near-capacity. Therefore, capturing variations in demand and in network attributes and performance is crucial for their modeling, monitoring and operation. As a result, traditional modeling approaches, such as those used in the static traffic assignment of demand forecasting models, fail to correctly predict managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced modeling approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when used to model managed lanes in congested environments. In addition, the study develops processes to support an effective utilization of DTA to model managed lane operations. Static and dynamic traffic assignments consist of demand, network, and route choice model components that need to be calibrated. These components interact with each other, and an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment has been developed to replicate real-world traffic conditions.
With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used in the different stages of modeling and calibration of managed lanes. Extensive and careful processing of demand, traffic, and toll data, as well as proper definition of performance measures, results in a calibrated and stable model which closely replicates real-world congestion patterns and can reasonably respond to perturbations in network and demand properties.

Relevance:

90.00%

Publisher:

Abstract:

Social exchange theory and notions of reciprocity have long been assumed to explain the relationship between psychological contract breach and important employee outcomes. To date, however, there has been no explicit testing of these assumptions. This research therefore explores the mediating role of negative, generalized, and balanced reciprocity in the relationships between psychological contract breach and employees’ affective organizational commitment and turnover intentions. A survey of 247 Pakistani employees of a large public university was analyzed using structural equation modeling and bootstrapping techniques, and provided excellent support for our model. As predicted, psychological contract breach was positively related to negative reciprocity norms and negatively related to generalized and balanced reciprocity norms. Negative reciprocity was negatively related, and generalized (but not balanced) reciprocity positively related, to employees’ affective organizational commitment, and these norms fully mediated the relationship between psychological contract breach and affective organizational commitment. Moreover, affective organizational commitment fully mediated the relationship between generalized and negative reciprocity and employees’ turnover intentions. Implications for theory and practice are discussed.

Relevance:

90.00%

Publisher:

Abstract:

The aims of this thesis were to determine the animal health status in organic dairy farms in Europe and to identify drivers for improving the current situation by means of a systemic approach. Prevalences of production diseases were determined in 192 herds in Germany, France, Spain, and Sweden (Paper I), and stakeholder consultations were performed to investigate potential drivers for improving animal health on the sector level (ibid.). Interactions between farm variables were assessed through impact analysis and evaluated to identify general system behaviour and to classify components according to their outgoing and incoming impacts (Papers II-III). The mean values and variances of the prevalences indicate that the common rules of organic dairy farming in Europe do not result in consistently low levels of production diseases. Stakeholders deemed it necessary to improve the current status and were generally in favour of establishing thresholds for the prevalence of production diseases in organic dairy herds, as well as of taking action to improve farms that do not meet those thresholds. In order to close the gap between the organic principle of health and organic farming practice, there is a need to formulate a common objective of good animal health and to install instruments to ensure and prove that this aim is pursued by all dairy farmers in Europe who sell their products under the organic label. Regular monitoring and evaluation of herd health performance based on reference values are considered preconditions for identifying farms not reaching the target and thus in need of improvement. Graph-based impact analysis was shown to be a suitable method for modeling and evaluating the manifold interactions between farm factors and for identifying the most influential components at the farm level, taking into account direct and indirect impacts as well as impact strengths.
Variables likely to affect the system as a whole, and the prevalence of production diseases in particular, varied largely between farms despite some general tendencies. This finding reflects the diversity of farm systems and underlines the importance of applying systemic approaches in health management. By reducing the complexity of farm systems and indicating farm-specific drivers, i.e. areas in a farm where changes will have a large impact, the presented approach has the potential to complement and enrich current advisory practice and to support farmers’ decision-making in terms of animal health.
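The core of graph-based impact analysis, classifying components by their outgoing (active) and incoming (passive) impacts, can be sketched in a few lines. The thesis's exact formulation is not reproduced here; the sketch below assumes a simple directed, weighted impact matrix in the style of Vester's sensitivity model, and the three-component farm example is entirely hypothetical.

```python
def impact_sums(impacts):
    """Outgoing (active) and incoming (passive) impact sums for each
    component of a directed, weighted impact graph.
    impacts[i][j] is the strength of the impact of component i on j."""
    n = len(impacts)
    outgoing = [sum(impacts[i][j] for j in range(n) if j != i)
                for i in range(n)]
    incoming = [sum(impacts[i][j] for i in range(n) if i != j)
                for j in range(n)]
    return outgoing, incoming

# Hypothetical 3-component farm example (values assumed):
# 0 = housing, 1 = feeding, 2 = disease prevalence
m = [[0, 2, 3],
     [1, 0, 2],
     [0, 1, 0]]
out_s, in_s = impact_sums(m)
```

In this toy matrix, housing has the largest outgoing sum (an influential driver) while disease prevalence has the largest incoming sum (a mostly passive outcome), mirroring the kind of classification described above; indirect impacts would require following paths through the graph rather than single edges.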

Relevance:

90.00%

Publisher:

Abstract:

This is a CoLab Workshop organized as an initiative of the UT Austin | Portugal Program to reinforce Portuguese competences in Nonlinear Mechanics and in complex problems arising from applications of mathematical modeling and simulation in the Life Sciences. The Workshop provides a place to exchange recent developments, discoveries and progress in this challenging research field. The main goal is to bring together doctoral candidates, postdoctoral scientists and graduates interested in the field, giving them the opportunity to interact scientifically and to make new connections with established experts in the interdisciplinary topics covered by the event. Another important goal of the Workshop is to promote collaboration between members of the different areas of the UT Austin | Portugal community.

Relevance:

90.00%

Publisher:

Abstract:

This book publishes results of high-tech applications of computational modeling and simulation of the dynamics of different flows, and of heat and mass transfer, in various fields of science and engineering.

Relevance:

90.00%

Publisher:

Abstract:

The discovery of new materials and their functions has always been a fundamental component of technological progress. Nowadays, the quest for new materials is stronger than ever: sustainability, medicine, robotics and electronics are all key assets which depend on the ability to create specifically tailored materials. However, designing materials with desired properties is a difficult task, and the complexity of the discipline makes it difficult to identify general criteria. While scientists have developed a set of best practices (often based on experience and expertise), this is still a trial-and-error process. This becomes even more complex when dealing with advanced functional materials. Their properties depend on structural and morphological features, which in turn depend on fabrication procedures and environment, and subtle alterations lead to dramatically different results. Because of this, materials modeling and design is one of the most prolific research fields. Many techniques and instruments are continuously developed to enable new possibilities, both in the experimental and computational realms. Scientists strive to employ cutting-edge technologies in order to make progress. However, the field is strongly affected by unorganized file management and the proliferation of custom data formats and storage procedures, in both experimental and computational research. Results are difficult to find, interpret and re-use, and a huge amount of time is spent interpreting and re-organizing data. This also strongly limits the application of data-driven and machine learning techniques. This work introduces possible solutions to the problems described above.
Specifically, it covers developing features for specific classes of advanced materials and using them to train machine learning models and accelerate computational predictions for molecular compounds; developing methods for organizing non-homogeneous materials data; automating the use of device simulations to train machine learning models; and dealing with scattered experimental data, using them to discover new patterns.

Relevance:

90.00%

Publisher:

Abstract:

In this work, we explore and demonstrate the potential for modeling and classification using quantile-based distributions, which are random variables defined by their quantile function. In the first part, we formalize a least squares estimation framework for the class of linear quantile functions, leading to unbiased and asymptotically normal estimators. Among the distributions with a linear quantile function, we focus on the flattened generalized logistic distribution (fgld), which offers a wide range of distributional shapes. A novel naïve Bayes classifier is proposed that utilizes the fgld estimated via least squares, and through simulations and applications we demonstrate its competitiveness against state-of-the-art alternatives. In the second part, we consider the Bayesian estimation of quantile-based distributions. We introduce a factor model with independent latent variables, which are distributed according to the fgld. Similar to the independent factor analysis model, this approach accommodates flexible factor distributions while using fewer parameters. The model is presented within a Bayesian framework, an MCMC algorithm for its estimation is developed, and its effectiveness is illustrated with data from the European Social Survey. The third part focuses on depth functions, which extend the concept of quantiles to multivariate data by imposing a center-outward ordering on the multivariate space. We investigate the recently introduced integrated rank-weighted (IRW) depth function, which is based on the distribution of random spherical projections of the multivariate data. This depth function proves to be computationally efficient, and to increase its flexibility we propose different methods to explicitly model the projected univariate distributions.
Its usefulness is shown in classification tasks: the maximum depth classifier based on the IRW depth is proven to be asymptotically optimal under certain conditions, and classifiers based on the IRW depth are shown to perform well in simulated and real-data experiments.
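The projection idea behind the IRW depth can be made concrete with a small Monte Carlo sketch. The formulation below, averaging the univariate depth min(F_u(u·x), 1 - F_u(u·x)) of the projection of x over random unit directions u, is one common way the IRW depth is written; the thesis's exact definition and the explicit models it proposes for the projected distributions may differ, so treat this as an illustrative assumption.

```python
import numpy as np

def irw_depth(x, data, n_dirs=500, seed=0):
    """Monte Carlo sketch of an integrated rank-weighted (IRW) depth:
    for each random unit direction u, project the sample and the query
    point, take the univariate depth min(F_u, 1 - F_u) of the projected
    query under the empirical CDF, and average over directions."""
    rng = np.random.default_rng(seed)
    d = data.shape[1]
    u = rng.normal(size=(n_dirs, d))
    u /= np.linalg.norm(u, axis=1, keepdims=True)   # uniform directions
    proj = data @ u.T                               # (n, n_dirs)
    px = x @ u.T                                    # (n_dirs,)
    F = (proj <= px).mean(axis=0)                   # empirical CDF at u.x
    return np.minimum(F, 1 - F).mean()

# Center of a Gaussian cloud should be deeper than a far-away point
data = np.random.default_rng(1).normal(size=(400, 3))
deep = irw_depth(np.zeros(3), data)
shallow = irw_depth(np.full(3, 5.0), data)
```

A maximum depth classifier then simply assigns a new point to the class within whose training sample it attains the largest depth.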

Relevance:

90.00%

Publisher:

Abstract:

Protected crop production is a modern and innovative approach to cultivating plants in a controlled environment to optimize growth, yield, and quality. This method uses structures such as greenhouses or tunnels to create a sheltered environment. These productive solutions are characterized by careful regulation of variables like temperature, humidity, light, and ventilation, which collectively contribute to creating an optimal microclimate for plant growth. Heating, cooling, and ventilation systems are used to maintain optimal conditions for plant growth regardless of external weather fluctuations. Protected crop production plays a crucial role in addressing the challenges posed by climate variability, population growth, and food security. Similarly, animal husbandry involves providing adequate nutrition, housing, medical care and environmental conditions to ensure animal welfare. Sustainability is a critical consideration in all forms of agriculture, including protected crop and animal production. Sustainability in animal production refers to the practice of producing animal products in a way that minimizes negative impacts on the environment, promotes animal welfare, and ensures the long-term viability of the industry. The research activities performed during the PhD fall squarely within the field of Precision Agriculture and Livestock Farming. The focus is on the computational fluid dynamics (CFD) approach and on environmental assessment, applied to improve yield, resource efficiency, environmental sustainability, and cost savings. This represents a significant shift from traditional farming methods to a more technology-driven, data-driven, and environmentally conscious approach to crop and animal production.
On one side, CFD is a powerful and precise technique for computer modeling and simulation of airflows and thermo-hygrometric parameters, and it has been applied here to optimize the growth environment of crops and the efficiency of ventilation in pig barns. On the other side, the sustainability aspect has been investigated and researched in terms of Life Cycle Assessment analyses.

Relevance:

90.00%

Publisher:

Abstract:

Instrument transformers serve an important role in the protection and isolation of AC electrical systems for the measurement of different electrical parameters such as voltage, current, power factor, frequency, and energy. As the name suggests, these transformers are used in connection with suitable measuring instruments such as ammeters, wattmeters, voltmeters, and energy meters. We have seen how higher voltages and currents are transformed into lower magnitudes to provide isolation between power networks, relays, and other instruments, reducing transients, suppressing electrical noise in sensitive devices, and standardizing instruments and relays to a few volts and amperes. Transformer performance directly affects the accuracy of power system measurements and the reliability of relay protection. We classified transformers in terms of purpose, insulating medium, voltage range, temperature range, humidity or environmental effects, indoor and outdoor use, performance, features, specifications, efficiency, cost, applications, benefits, and limitations, which enabled us to comprehend their correct use and the selection criteria based on desired requirements. We also discussed modern low-power instrument transformer products recently launched or offered by renowned companies such as Schneider Electric, Siemens, ABB, ZIV, and G&W. These new products are innovations and problem solvers in the domains of measurement, protection, digital communication, and advanced and commercial energy metering. Since there is always room for improvement, we explore new advantages of low-power instrument transformers in the domains of wide linearity, high frequency range, miniaturization, structural and technological modification, integration, smart frequency modeling, and output prediction of low-power voltage transformers.

Relevance:

80.00%

Publisher:

Abstract:

Relief influences soil texture variability, since it contributes to the time of exposure of the materials to weathering factors. Our work was carried out in the municipality of Gavião Peixoto (SP), with the objective of characterizing the spatial variability of the texture of a dystrophic Red Latosol cultivated with citrus. The hillside was divided into three segments: top, middle slope, and lower slope. Soil samples were collected in a grid with regular intervals of 50 m, at depths of 0.0-0.2 m and 0.6-0.8 m, comprising a total of 332 points in an area of 83.5 ha. The data were submitted to descriptive and geostatistical analyses (semivariogram modeling and kriging maps). The spatial behavior of the texture of the Oxisols in this study is directly related to the relief forms, which control the direction of surface and subsurface water flows. The concept of homogeneity of clay distribution in the Oxisol profile is a piece of information that can be adjusted by knowing the spatial pattern of this distribution in different relief forms.
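The first step of the semivariogram modeling mentioned above is the empirical semivariogram, half the mean squared difference between values of pairs of samples separated by (approximately) a given lag distance. The sketch below is a minimal version of that step only (no model fitting or kriging), and the clay values are synthetic; only the 50 m grid spacing echoes the study.

```python
import numpy as np

def empirical_semivariogram(coords, values, lag, tol):
    """gamma(h) = 0.5 * mean[(z_i - z_j)^2] over pairs whose separation
    distance falls within lag +/- tol."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    n = len(values)
    sq_diffs = []
    for i in range(n):
        for j in range(i + 1, n):
            h = np.linalg.norm(coords[i] - coords[j])
            if abs(h - lag) <= tol:
                sq_diffs.append((values[i] - values[j]) ** 2)
    return 0.5 * np.mean(sq_diffs) if sq_diffs else float("nan")

# 4x4 grid with 50 m spacing, as in the study; clay contents are made up
coords = [(x, y) for x in range(0, 200, 50) for y in range(0, 200, 50)]
rng = np.random.default_rng(0)
clay = 300 + rng.normal(0, 10, len(coords))       # g/kg, synthetic
gamma_50 = empirical_semivariogram(coords, clay, lag=50, tol=1)
```

Computing gamma at several lags and fitting a model (e.g. spherical) to the resulting points is what then feeds the kriging maps.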

Relevance:

80.00%

Publisher:

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física