941 results for Analysis Model
Abstract:
In this paper we present the development and implementation of a content analysis model for observing aspects of the public library's social mission on Facebook pages and websites. The model is original and was developed from the literature. Four analysis categories were designed: Generate social capital and social cohesion; Consolidate democracy and citizenship; Social and digital inclusion; and Fighting illiteracies. The model enabled the collection and analysis of data in a case study of 99 Portuguese public libraries with Facebook pages. With this content analysis model we observed the facets of the social mission and read the actions with social facets on the Facebook pages and websites of public libraries. At the end we discuss in parallel the results of observing the libraries' Facebook pages and their websites. From reading the descriptions of social-mission actions, the most immediate general conclusion is that the 99 public libraries rarely publish actions of a social character on Facebook or their websites, and the results are not very satisfactory. The Portuguese public libraries emphasise mainly actions in the category Generate social capital and social cohesion.
Abstract:
The goal of this project is to learn the necessary steps to create a finite element model that can accurately predict the dynamic response of a Kohler Engines Heavy Duty Air Cleaner (HDAC). This air cleaner is composed of three glass-reinforced plastic components and two air filters. Several uncertainties arose in the finite element (FE) model due to the HDAC's component material properties and assembly conditions. To help understand and mitigate these uncertainties, analytical and experimental modal models were created concurrently to perform a model correlation and calibration. Over the course of the project, simple and practical methods were found for future FE model creation. Similarly, an experimental method for the optimal acquisition of experimental modal data was established. After the model correlation and calibration were performed, a validation experiment was used to confirm the FE model's predictive capabilities.
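The abstract does not name the correlation metric used, but agreement between analytical and experimental modal models is commonly quantified with the Modal Assurance Criterion (MAC). A minimal sketch, with hypothetical mode shapes (not data from this project):

```python
import numpy as np

def mac(phi_a, phi_e):
    """Modal Assurance Criterion between two mode-shape vectors.
    Returns a value in [0, 1]; 1 means perfectly correlated shapes."""
    num = np.abs(phi_a.conj() @ phi_e) ** 2
    den = (phi_a.conj() @ phi_a) * (phi_e.conj() @ phi_e)
    return float(np.real(num / den))

# Illustrative mode shapes for a hypothetical 4-DOF air-cleaner model
phi_fe   = np.array([1.0, 0.80, 0.50, 0.10])   # finite element prediction
phi_test = np.array([1.0, 0.78, 0.52, 0.12])   # experimental estimate

print(round(mac(phi_fe, phi_test), 3))  # close to 1 for well-correlated modes
```

A MAC matrix over all mode pairs is the usual way to pair FE modes with test modes before calibrating model parameters.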
Abstract:
The purpose of this project is to develop an investment analysis model that integrates the capabilities of four types of analysis for use in evaluating interurban transportation system improvements. The project will also explore the use of new data warehousing and mining techniques to design the types of databases required for supporting such a comprehensive transportation model. The project consists of four phases. The first phase, which is documented in this report, involves development of the conceptual foundation for the model. Prior research is reviewed in Chapter 1, which is composed of three major sections providing demand modeling background information for passenger transportation, transportation of freight (manufactured products and supplies), and transportation of natural resources and agricultural commodities. Material from the literature on geographic information systems makes up Chapter 2. Database models for the national and regional economies and for the transportation and logistics network are conceptualized in Chapter 3. Demand forecasting of transportation service requirements is introduced in Chapter 4, with separate sections for passenger transportation, freight transportation, and transportation of natural resources and commodities. Characteristics and capacities of the different modes, modal choices, and route assignments are discussed in Chapter 5. Chapter 6 concludes with a general discussion of the economic impacts and feedback of multimodal transportation activities and facilities.
Abstract:
The occupational exposure limits of different risk factors for the development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown phenomena. The industrial ergonomist's role is further complicated because the potential risk factors that may contribute to the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to compare predictions made by the neural network-based model proposed by Zurada, Karwowski & Marras (1997) with those of a linear discriminant analysis model for classifying industrial jobs according to their potential risk of low back disorders due to workplace design. The results proved the discriminant analysis-based model to be as effective as the neural network-based model. Moreover, the discriminant analysis-based model proved to be more advantageous regarding cost and time savings for future data gathering.
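As an illustration of the second approach, a two-class Fisher linear discriminant can be fitted directly with NumPy. The job features and risk groups below are synthetic, not the data of the cited studies:

```python
import numpy as np

def fisher_discriminant(X_low, X_high):
    """Fit a two-class Fisher linear discriminant.
    Returns (w, threshold): classify x as high-risk if w @ x > threshold."""
    m0, m1 = X_low.mean(axis=0), X_high.mean(axis=0)
    # Pooled within-class scatter matrix
    S = np.cov(X_low, rowvar=False) * (len(X_low) - 1) \
      + np.cov(X_high, rowvar=False) * (len(X_high) - 1)
    w = np.linalg.solve(S, m1 - m0)          # discriminant direction
    threshold = w @ (m0 + m1) / 2.0          # midpoint between class means
    return w, threshold

# Hypothetical job features: [lifts per minute, load moment, trunk flexion]
rng = np.random.default_rng(0)
low  = rng.normal([2.0, 10.0, 15.0], 1.0, size=(30, 3))  # low-risk jobs
high = rng.normal([6.0, 30.0, 40.0], 1.0, size=(30, 3))  # high-risk jobs
w, t = fisher_discriminant(low, high)
print(w @ [6.0, 30.0, 40.0] > t)  # a clearly high-risk job -> True
```

The fitted direction `w` also gives the cost advantage mentioned in the abstract: new jobs are scored with a single dot product instead of a trained network.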
Abstract:
In this work, all publicly accessible published findings on Alicyclobacillus acidoterrestris heat resistance in fruit beverages as affected by temperature and pH were compiled. Then, study characteristics (protocols, fruit and variety, °Brix, pH, temperature, heating medium, culture medium, inactivation method, strains, etc.) were extracted from the primary studies, and some of them were incorporated into a meta-analysis mixed-effects linear model based on the basic Bigelow equation describing the heat resistance parameters of this bacterium. The model estimated mean D* values (the time needed for one log reduction at a temperature of 95 °C and a pH of 3.5) of Alicyclobacillus in beverages of different fruits, of two different concentration types, with and without bacteriocins, and with and without clarification. The zT values (the temperature change needed to cause one log reduction in D-values) estimated by the meta-analysis model were compared to the 'observed' zT values reported in the primary studies, and in all cases they were within the confidence intervals of the model. The model was capable of predicting the heat resistance parameters of Alicyclobacillus in fruit beverages beyond the types available in the meta-analytical data. It is expected that the compilation of the thermal resistance of Alicyclobacillus in fruit beverages carried out in this study will be of utility to food quality managers in the determination or validation of the lethality of their current heat treatment processes.
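A minimal sketch of the Bigelow-type secondary model underlying such a meta-analysis, with illustrative parameter values (not the paper's estimates):

```python
import math

def d_value(T, pH, D_star, z_T, z_pH, T_ref=95.0, pH_ref=3.5):
    """Bigelow-type secondary model: decimal reduction time D (min) as a
    function of temperature (degC) and pH, anchored at the reference
    condition (95 degC, pH 3.5) used for D* in the abstract."""
    log10_D = math.log10(D_star) - (T - T_ref) / z_T - (pH - pH_ref) / z_pH
    return 10.0 ** log10_D

# Illustrative parameters (NOT estimates from the primary studies)
D_star, z_T, z_pH = 2.0, 10.0, 4.0
print(round(d_value(95.0, 3.5, D_star, z_T, z_pH), 2))   # 2.0 at the reference
print(round(d_value(105.0, 3.5, D_star, z_T, z_pH), 2))  # one z_T higher: 0.2
```

Raising the temperature by one z_T (here 10 degC) cuts D tenfold, which is exactly the "one log reduction in D-values" definition quoted in the abstract.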
Abstract:
The Montreal Process indicators are intended to provide a common framework for assessing and reviewing progress toward sustainable forest management. The potential of a combined geometrical-optical/spectral mixture analysis model was assessed for mapping the Montreal Process age class and successional stage indicators at a regional scale using Landsat Thematic Mapper data. The project location is an area of eucalyptus forest in Emu Creek State Forest, Southeast Queensland, Australia. A quantitative model was applied relating the spectral reflectance of a forest to the illumination geometry, the slope and aspect of the terrain surface, and the size, shape, and density of the trees and their canopies. Inversion of this model necessitated the use of spectral mixture analysis to recover subpixel information on the fractional extent of ground scene elements (such as sunlit canopy, shaded canopy, sunlit background, and shaded background). Results obtained from a sensitivity analysis allowed improved allocation of resources to maximize the predictive accuracy of the model. It was found that modeled estimates of crown cover projection, canopy size, and tree density were in significant agreement with field and air-photo-interpreted estimates. However, the accuracy of the successional stage classification was limited. The results obtained highlight the potential for future integration of high and moderate spatial resolution imaging sensors for monitoring forest structure and condition. (C) Elsevier Science Inc., 2000.
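The subpixel unmixing step can be sketched as a least-squares problem with a sum-to-one constraint on the scene-element fractions. The endmember spectra below are hypothetical, not measurements from this study:

```python
import numpy as np

# Hypothetical endmember reflectances (rows: 4 bands; columns: sunlit canopy,
# shaded canopy, sunlit background, shaded background)
E = np.array([[0.05, 0.02, 0.30, 0.10],
              [0.08, 0.05, 0.35, 0.05],
              [0.45, 0.10, 0.40, 0.60],
              [0.30, 0.08, 0.45, 0.20]])

def unmix(pixel, E, weight=1e3):
    """Least-squares subpixel fractions with a soft sum-to-one constraint,
    implemented by appending a heavily weighted row of ones."""
    A = np.vstack([E, weight * np.ones(E.shape[1])])
    b = np.append(pixel, weight * 1.0)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f

true_f = np.array([0.5, 0.2, 0.2, 0.1])   # fractions used to build the pixel
pixel = E @ true_f                         # noise-free mixed reflectance
f = unmix(pixel, E)
print(np.round(f, 3))  # recovers approximately [0.5, 0.2, 0.2, 0.1]
```

Real unmixing would also enforce non-negativity of the fractions; the geometric-optical model then relates these fractions back to crown size, shape, and density.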
Abstract:
The Cultural Property Risk Analysis Model was applied in 2006 to a Portuguese archive located in Lisbon. Its results highlighted the need for the institution to take care of risks related to fire, physical forces and relative humidity problems. Five years after this first analysis the results are revisited and a few changes are introduced due to recent events: fire and high humidity remain an important hazard but are now accompanied by a pressing contaminants problem. Improvements in storage systems were responsible for a large decrease in terms of calculated risk magnitude and proved to be very cost-effective.
Abstract:
Fieldbus networks aim at the interconnection of field devices such as sensors, actuators and small controllers. They are therefore an effective technology upon which Distributed Computer Controlled Systems (DCCS) can be built. DCCS impose strict timeliness requirements on the communication network. In essence, by timeliness requirements we mean that traffic must be sent and received within a bounded interval; otherwise a timing fault is said to occur. P-NET is a multi-master fieldbus standard based on a virtual token passing scheme. In P-NET each master is allowed to transmit only one message per token visit, which means that the worst-case communication response time could be derived by assuming that the token is fully utilised by all stations. However, such an analysis can be shown to be quite pessimistic. In this paper we propose a more sophisticated P-NET timing analysis model, which considers the actual token utilisation by the different masters. The major contribution of this model is to provide a less pessimistic, and thus more accurate, analysis for the evaluation of the worst-case communication response time in P-NET fieldbus networks.
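The classic pessimistic bound the paper improves on assumes every master uses its single per-visit transmission slot on each token rotation. A simplified illustration (not the standard's detailed timing model):

```python
def worst_case_response(longest_frame, token_pass):
    """Pessimistic worst-case response time for one queued message in a
    virtual-token-passing fieldbus, assuming full token utilisation:
    every master transmits its longest frame once per rotation.

    longest_frame: list of each master's longest frame time (ms)
    token_pass:    token passing overhead between masters (ms)
    """
    # One full worst-case token rotation; a message queued just after its
    # master releases the token is served by the end of that rotation.
    return sum(c + token_pass for c in longest_frame)

# Hypothetical 4-master network (times in ms)
frames = [2.0, 1.5, 3.0, 2.5]
print(worst_case_response(frames, token_pass=0.5))  # 11.0
```

The paper's refinement replaces the assumption that each `c` is always transmitted with the actual per-master token utilisation, tightening this sum.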
Abstract:
Paper presented at the 8º Congresso Nacional de Administração Pública - Desafios e Soluções, held in Carcavelos, 21-22 November 2011.
Abstract:
Thesis submitted to the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia for the degree of Doctor of Philosophy in Environmental Engineering
Abstract:
Intensification of agricultural production without sound management and regulation can lead to severe environmental problems, as in Western Santa Catarina State, Brazil, where intensive swine production has caused large accumulations of manure and consequently water pollution. Natural resource scientists are asked by decision-makers for advice on management and regulatory decisions. Distributed environmental models are useful tools, since they can be used to explore the consequences of various management practices. However, in many areas of the world, quantitative data for model calibration and validation are lacking. The data-intensive distributed environmental model AgNPS was applied in a data-poor environment, the upper catchment (2,520 ha) of the Ariranhazinho River, near the city of Seara, in Santa Catarina State. Steps included data preparation, cell size selection, sensitivity analysis, model calibration and application to different management scenarios. The model was calibrated based on a best guess for model parameters and on a pragmatic sensitivity analysis. The parameters were adjusted to match model outputs (runoff volume, peak runoff rate and sediment concentration) closely with the sparse observed data. A modelling grid cell resolution of 150 m produced appropriate results at acceptable computational cost. The rainfall-runoff response of the AgNPS model was calibrated using three separate rainfall ranges (< 25, 25-60, > 60 mm). Predicted sediment concentrations were consistently six to ten times higher than observed, probably due to sediment trapping along vegetated channel banks. Predicted N and P concentrations in stream water ranged from just below to well above regulatory norms. Expert knowledge of the area, in addition to experience reported in the literature, was able to compensate in part for limited calibration data.
Several scenarios (actual, recommended and excessive manure applications, and point source pollution from swine operations) could be compared by the model, using a relative ranking rather than quantitative predictions.
Can the administration be trusted? An analysis of the concept of trust, applied to the public sector
Abstract:
In the first part of this paper, we present the various academic debates and, where applicable, questions that remain open in the literature, particularly regarding the nature of trust, the distinction between trust and trustworthiness, its role in specific relationships and its relationship to control. We then propose a way of demarcating and operationalizing the concepts of trust and trustworthiness. In the second part, on the basis of the conceptual clarifications we present, we put forward a number of "anchor points" regarding how trust is apprehended in the public sector with regard to the various relationships that can be studied. Schematically, we distinguish between two types of relationships in the conceptual approach to trust: on one hand, the trust that citizens, or third parties, place in the State or in various public sector authorities or entities, and on the other hand, trust within the State or the public sector, between its various authorities, entities, and actors. While studies have traditionally focused on citizens' trust in their institutions, the findings, limitations and problems observed in public-sector coordination following the reforms associated with New Public Management have also elicited growing interest in the study of trust in the relationships between the various actors within the public sector. Both the theoretical debates we present and our propositions have been extracted and adapted from an empirical comparative study of coordination between various Swiss public-service organizations and their politico-administrative authority. Using the analysis model developed for this specific relationship, between various actors within the public service, and in the light of theoretical elements on which development of this model was based, we propose some avenues for further study - questions that remain open - regarding the consideration and understanding of citizens' trust in the public sector.
Abstract:
Failure Mode and Effect Analysis (FMEA) was applied for risk assessment of confectionery manufacturing, in which traditional methods and equipment were used intensively in production. Potential failure modes and effects, as well as their possible causes, were identified in the process flow. Processing stages that involve intensive handling of food by workers had the highest risk priority numbers (RPN = 216 and 189), followed by chemical contamination risks in different stages of the process. The application of corrective actions substantially reduced the RPN values. Therefore, the implementation of the FMEA model in confectionery manufacturing improved the safety and quality of the final products.
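The RPN itself is simply the product of three ratings. The failure modes and scores below are hypothetical, chosen only to reproduce the RPN values quoted in the abstract:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of severity, occurrence and
    detectability ratings (each typically scored on a 1-10 scale)."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMEA ratings are usually on a 1-10 scale")
    return severity * occurrence * detection

# Hypothetical failure modes; ratings are NOT taken from the paper
failure_modes = {
    "manual handling contamination": (6, 6, 6),  # RPN 216
    "chemical contamination":        (7, 3, 9),  # RPN 189
}
for name, scores in sorted(failure_modes.items(),
                           key=lambda kv: rpn(*kv[1]), reverse=True):
    print(name, rpn(*scores))
```

Ranking by descending RPN is what prioritises corrective actions; re-scoring after those actions is how the "substantially reduced RPN values" in the abstract are demonstrated.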
Abstract:
Please consult the paper edition of this thesis to read. It is available on the 5th Floor of the Library at Call Number: Z 9999 E38 D56 1992
Abstract:
Factor analysis, a frequent technique for multivariate data inspection, is widely used also for compositional data analysis. The usual way is to use a centred logratio (clr) transformation to obtain the random vector y of dimension D. The factor model is then y = Λf + e (1) with the factors f of dimension k < D, the error term e, and the loadings matrix Λ. Using the usual model assumptions (see, e.g., Basilevsky, 1994), the factor analysis model (1) can be written as Cov(y) = ΛΛᵀ + ψ (2) where ψ = Cov(e) has diagonal form. The diagonal elements of ψ as well as the loadings matrix Λ are estimated from an estimate of Cov(y). Let the observed clr-transformed data Y be realizations of the random vector y. Outliers or deviations from the idealized model assumptions of factor analysis can severely affect the parameter estimation. As a way out, robust estimation of the covariance matrix of Y will lead to robust estimates of Λ and ψ in (2); see Pison et al. (2003). Well-known robust covariance estimators with good statistical properties, like the MCD or the S-estimators (see, e.g., Maronna et al., 2006), rely on a full-rank data matrix Y, which is not the case for clr-transformed data (see, e.g., Aitchison, 1986). The isometric logratio (ilr) transformation (Egozcue et al., 2003) solves this singularity problem. The data matrix Y is transformed to a matrix Z by using an orthonormal basis of lower dimension. Using the ilr-transformed data, a robust covariance matrix C(Z) can be estimated. The result can be back-transformed to the clr space by C(Y) = V C(Z) Vᵀ, where the matrix V with orthonormal columns comes from the relation between the clr and the ilr transformation. Now the parameters in model (2) can be estimated (Basilevsky, 1994) and the results have a direct interpretation, since the links to the original variables are still preserved. The above procedure will be applied to data from geochemistry. Our special interest is in comparing the results with those of Reimann et al. (2002) for the Kola project data.
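A minimal sketch of the clr/ilr machinery and the back-transformation C(Y) = V C(Z) Vᵀ, using a Helmert-type orthonormal basis (the robust covariance estimation itself is omitted; the identity matrix below is only a placeholder for, e.g., an MCD estimate):

```python
import numpy as np

def clr(x):
    """Centred logratio transform of a composition x (positive parts)."""
    lx = np.log(x)
    return lx - lx.mean()

def ilr_basis(D):
    """Orthonormal (Helmert-type) basis V with clr(x) = V @ ilr(x);
    the D-1 columns span the zero-sum clr hyperplane."""
    V = np.zeros((D, D - 1))
    for j in range(1, D):
        v = np.zeros(D)
        v[:j] = 1.0 / j
        v[j] = -1.0
        V[:, j - 1] = v * np.sqrt(j / (j + 1.0))  # normalise to unit length
    return V

def ilr(x):
    """Isometric logratio coordinates: full-rank, dimension D-1."""
    return ilr_basis(len(x)).T @ clr(x)

D = 4
V = ilr_basis(D)
x = np.array([0.1, 0.2, 0.3, 0.4])           # a 4-part composition
z = ilr(x)
C_Z = np.eye(D - 1)                           # placeholder robust estimate
C_Y = V @ C_Z @ V.T                           # back-transform to clr space
print(np.allclose(V.T @ V, np.eye(D - 1)))    # True: orthonormal columns
print(np.allclose(V @ z, clr(x)))             # True: ilr maps back to clr
```

Because the clr parts of a composition sum to zero, Y is rank-deficient; working in the D-1 ilr coordinates restores full rank for the robust estimator, and the V-sandwich recovers clr-space results with interpretable links to the original variables.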