922 results for Location-aware process modeling
Abstract:
The objective of this study was to estimate the spatial distribution of work accident risk in the informal work market in the urban zone of an industrialized city in southeast Brazil and to examine concomitant effects of age, gender, and type of occupation after controlling for spatial risk variation. The basic methodology adopted was that of a population-based case-control study with particular interest focused on the spatial location of work. Cases were all casual workers in the city suffering work accidents during a one-year period; controls were selected from the source population of casual laborers by systematic random sampling of urban homes. The spatial distribution of work accidents was estimated via a semiparametric generalized additive model with a nonparametric bidimensional spline of the geographical coordinates of cases and controls as the nonlinear spatial component, and including age, gender, and occupation as linear predictive variables in the parametric component. We analyzed 1,918 cases and 2,245 controls between 1/11/2003 and 31/10/2004 in Piracicaba, Brazil. Areas of significantly high and low accident risk were identified in relation to mean risk in the study region (p < 0.01). Work accident risk for informal workers varied significantly in the study area. Significant age, gender, and occupational group effects on accident risk were identified after correcting for this spatial variation. A good understanding of high-risk groups and high-risk regions underpins the formulation of hypotheses concerning accident causality and the development of effective public accident prevention policies.
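The semiparametric structure described above can be made concrete with a small sketch: a logistic model whose linear predictor combines a nonlinear spatial term with a parametric linear covariate. This is an illustration only, not the study's method: the radial-basis bumps below are a crude stand-in for the bidimensional spline, and the knots, synthetic data, and SGD fitting loop are all invented; a real analysis would use a proper GAM package (e.g. R's mgcv).

```python
import math, random

# Toy semiparametric "case-control GAM": spatial radial-basis term + linear
# age term + intercept, fit by plain stochastic gradient descent.
KNOTS = [(-1, -1), (-1, 1), (1, -1), (1, 1), (0, 0)]   # hypothetical spatial knots

def design_row(x, y, age):
    spatial = [math.exp(-((x - kx) ** 2 + (y - ky) ** 2)) for kx, ky in KNOTS]
    return spatial + [(age - 40.0) / 20.0, 1.0]        # spatial basis + scaled age + intercept

def fit_logistic(rows, labels, lr=0.1, epochs=200):
    w = [0.0] * len(rows[0])
    for _ in range(epochs):
        for r, t in zip(rows, labels):
            p = 1.0 / (1.0 + math.exp(-sum(wi * ri for wi, ri in zip(w, r))))
            w = [wi - lr * (p - t) * ri for wi, ri in zip(w, r)]
    return w

def risk(w, x, y, age):
    z = sum(wi * ri for wi, ri in zip(w, design_row(x, y, age)))
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
# synthetic data: accident cases cluster around (0, 0), controls are uniform
cases    = [design_row(random.gauss(0, 0.4), random.gauss(0, 0.4), random.uniform(20, 60))
            for _ in range(200)]
controls = [design_row(random.uniform(-2, 2), random.uniform(-2, 2), random.uniform(20, 60))
            for _ in range(200)]
w = fit_logistic(cases + controls, [1] * 200 + [0] * 200)
```

After fitting, the estimated risk surface is elevated near the synthetic hotspot and low far from it, which is the kind of spatial risk variation the study maps.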
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The purpose of this study is to analyze the strategies used by families living in at-risk and vulnerable situations registered with the Estratégia Saúde da Família (ESF) (Family Health Strategy) as they face their daily problems. This is a qualitative investigation, using interviews as the main tool of the empirical approach. Ten women from the Panatis location in northern Natal, Rio Grande do Norte, whose families live in precarious socioeconomic situations, were interviewed between April and June 2007. The reports revealed that a mixture of improvisation and creativity served as strategies for overcoming the privations and necessities of daily life. We also concluded that these families sought solutions to their problems through religiosity and a gift-reciprocity system, as resources for obtaining personal recognition and support in adversity. The results additionally point to the ESF as one of the strategies used by these families in their search for attention and care. From this perspective, the ESF has proven to be a place for listening and for building ties, consolidated through home visits, organized groups, and the parties and outings promoted in the community, reestablishing contact and support among people and signaling a way out of abandonment and isolation. As holders of knowledge constructed through life experience, the participants of the study led us to infer the need to expand spaces that allow them to express meanings, values, and experiences, and to consider that becoming ill is a process that incorporates dimensions of life beyond the physical. As health professionals, we need to be aware of the multiple and creative abilities used in the daily lives of these families, so that we can, along with them, reinvent a new way of dealing with health.
Abstract:
Internet applications such as media streaming, collaborative computing, and massively multiplayer games are on the rise. This creates a need for multicast communication, but group communication support based on IP multicast has not been widely adopted, owing to a combination of technical and non-technical problems. A number of application-layer multicast schemes have therefore been proposed in the recent literature to overcome these drawbacks. In addition, these applications often behave as both providers and clients of services (so-called peer-to-peer applications), in which participants come and go very dynamically. Server-centric architectures for membership management thus suffer from well-known scalability and fault-tolerance problems, and even traditional peer-to-peer solutions need some mechanism that accounts for member volatility. Location awareness distributes the participants in the overlay network according to their proximity in the underlying network, allowing better performance. In this context, this thesis proposes an application-layer multicast protocol, called LAALM, which takes the actual network topology into account when assembling the overlay network. The membership algorithm uses a new metric, IPXY, to provide location awareness through the processing of local information, and it was implemented over a distributed, shared, bi-directional tree. The algorithm also includes a sub-optimal heuristic to minimize the cost of the membership process. The protocol was evaluated in two ways: first, through a simulator developed in this work, which assessed the quality of the distribution tree via metrics such as out-degree and path length; second, through real-life scenarios built in the ns-3 network simulator, which assessed protocol performance via metrics such as stress, stretch, time to first packet, and group reconfiguration time.
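The core idea of location-aware membership, attaching a joining peer to a nearby member that still has spare tree capacity, can be sketched in a few lines. This is only an illustration of the idea, not the LAALM membership algorithm: Euclidean distance over made-up coordinates stands in for the IPXY proximity metric, and the fixed out-degree cap is an invented parameter.

```python
import math

def dist(a, b):
    # stand-in proximity metric; LAALM's IPXY works on network information instead
    return math.hypot(a[0] - b[0], a[1] - b[1])

def join(children, coords, new_id, max_out=3):
    # attach the new peer to the nearest member with spare out-degree
    candidates = [n for n in children if len(children[n]) < max_out]
    parent = min(candidates, key=lambda n: dist(coords[n], coords[new_id]))
    children[parent].append(new_id)
    children[new_id] = []
    return parent

coords = {"root": (0, 0), "a": (5, 5), "b": (9, 1)}
children = {"root": ["a", "b"], "a": [], "b": []}
coords["c"] = (8, 0)
parent = join(children, coords, "c")   # "c" is closest to "b"
```

Bounding the out-degree keeps the forwarding load per member low; choosing the nearest eligible parent keeps overlay hops short in the underlying network, which is what metrics such as stress and stretch measure.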
Abstract:
This paper describes a branch-and-price algorithm for the p-median location problem. The objective is to locate p facilities (medians) such that the sum of the distances from each demand point to its nearest facility is minimized. The traditional column generation process is compared with a stabilized approach that combines column generation and Lagrangean/surrogate relaxation. The Lagrangean/surrogate multiplier modifies the reduced-cost criterion, guiding the selection of new productive columns in the search tree. Computational experiments are conducted on instances that are especially difficult for traditional column generation, as well as on some large-scale instances. (C) 2004 Elsevier Ltd. All rights reserved.
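To make the objective concrete, the p-median cost can be evaluated and, for a toy one-dimensional instance, minimized by brute-force enumeration. This is only to pin down what is being optimized; enumeration is feasible only at toy sizes, which is precisely why methods like the paper's stabilized branch-and-price are needed at scale.

```python
from itertools import combinations

def pmedian_cost(points, medians):
    # each demand point is served by its nearest open facility
    return sum(min(abs(p - m) for m in medians) for p in points)

def solve_pmedian(points, p):
    # exhaustive search over all p-subsets (toy sizes only)
    best = min(combinations(points, p), key=lambda ms: pmedian_cost(points, ms))
    return best, pmedian_cost(points, best)

points = [0, 1, 2, 10, 11, 12]            # two natural clusters on a line
medians, cost = solve_pmedian(points, 2)  # optimal medians sit at cluster centers
```

On this instance the optimum opens facilities at 1 and 11, serving each cluster at total distance 4.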
Abstract:
A cure kinetic model is an integral part of composite process simulation, used to predict the degree of cure and the amount of heat generated. The parameters involved in kinetic models are usually determined empirically from isothermal or dynamic differential scanning calorimetry (DSC) data. In this work, DSC and rheological techniques were used to investigate some of the kinetic parameters of the cure reactions of carbon/F161 epoxy prepreg and to evaluate the cure cycle used to manufacture polymeric composites for aeronautical applications. It was observed that the F161 prepreg exhibits cure kinetics with a total reaction order of 1.2-1.9. (c) 2006 Springer Science + Business Media, Inc.
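A standard nth-order cure kinetic model of the kind such studies fit is dα/dt = k(T)·(1 − α)^n with an Arrhenius rate constant. The sketch below integrates it for an isothermal hold; note that A, Ea, and the hold temperature are made-up illustrative values (not fitted F161 parameters), with n = 1.5 chosen inside the reported 1.2-1.9 range.

```python
import math

R = 8.314                     # gas constant, J/(mol K)
A, Ea, n = 1.0e5, 6.0e4, 1.5  # hypothetical pre-exponential, activation energy, order

def cure_profile(T_kelvin, dt=1.0, t_end=3600):
    # forward-Euler integration of d(alpha)/dt = k * (1 - alpha)^n
    k = A * math.exp(-Ea / (R * T_kelvin))   # Arrhenius rate constant
    alpha, history = 0.0, []
    for _ in range(int(t_end / dt)):
        alpha += dt * k * (1.0 - alpha) ** n
        alpha = min(alpha, 1.0)              # degree of cure is capped at 1
        history.append(alpha)
    return history

profile = cure_profile(450.0)                # ~177 C isothermal hold, 1 hour
```

The degree of cure rises monotonically toward 1, and a hotter hold cures faster, which is the qualitative behavior a cure-cycle evaluation checks against DSC data.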
Abstract:
The purpose of this study was to investigate the social-environmental implications, for the nearby communities, of the first large-scale wind farm built in Brazil (2006), Parque Eólico de Rio do Fogo (PERF). The research was based on an adaptation of the DIS/BCN tool for analyzing social impact, linked to a multi-method approach. Applying an autophotography strategy, cameras were given to five children from the district of Zumbi, the location nearest to PERF, who were asked to individually photograph the six places they liked most and the six places they liked least in their community. These children were then interviewed, individually and collectively, about the photographs. Adult locals in Zumbi, residents of the Zumbi/Rio do Fogo settlement, members of the state and municipal governments, and representatives of PERF were also interviewed with the aid of some of the children's pictures and of others chosen to trigger responses, a strategy called the sample function. The five children presented a positive image of PERF; all of them chose to photograph it as one of the places they liked. The adult population of Zumbi also evaluated PERF positively. Only a small number of the interviewees were aware of the environmental and social benefits of wind energy production. Residents did not participate in the decision-making process regarding PERF. They approved the project, especially because of the jobs provided during construction. Nowadays, PERF is something apart from their lives, because it no longer provides jobs or any other interaction between the facility and the locals. Residents relate to the land, not to the facility. However, there is no evidence of rejection of PERF; it is simply seen as neutral to their lives.
The low levels of education, a traditional lack of social commitment and citizenship, and the experience accumulated by PERF's planners and builders in other countries may all have contributed to the fact that Zumbi residents did not oppose PERF. It is clear that the country needs legislation that seriously considers the psycho-social dimension involved in the implementation of wind farms.
Abstract:
The conventional Newton and fast decoupled power flow (FDPF) methods have been considered inadequate for obtaining the maximum loading point of power systems, owing to ill-conditioning problems at and near this critical point. It is well known that the PV and Q-theta decoupling assumptions of the fast decoupled power flow formulation no longer hold in the vicinity of the critical point. Moreover, the Jacobian matrix of the Newton method becomes singular at this point. However, the maximum loading point can be efficiently computed through the parameterization techniques of continuation methods. In this paper it is shown that, by using either theta or V as a parameter, the new fast decoupled power flow versions (XB and BX) become adequate for the computation of the maximum loading point with only a few small modifications. The possible use of the reactive power injection at a selected PV bus (Q(PV)) as the continuation parameter (mu) for the computation of the maximum loading point is also shown. A trivial secant predictor, the modified zero-order polynomial, which uses the current solution and a fixed increment in the parameter (V, theta, or mu) as the estimate for the next solution, is used in the predictor step. These new versions are compared with each other to point out their features, as well as the influence of reactive power and transformer tap limits. The results obtained with the new approach for the IEEE test systems (14, 30, 57, and 118 buses) are presented and discussed in the companion paper. They show that the characteristics of the conventional method are enhanced and the region of convergence around the singular solution is enlarged. In addition, it is shown that parameters can be switched during the tracing process in order to efficiently determine all the PV curve points with few iterations. (C) 2003 Elsevier B.V. All rights reserved.
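The continuation mechanics described above (a trivial predictor that keeps the current solution and steps the chosen parameter by a fixed increment, a Newton corrector, and parameter switching when the corrector fails near the nose) can be shown on a toy scalar equation. Nothing below is the power flow itself: x² + λ = 1 merely has a nose at λ = 1 where df/dx = 2x vanishes, mimicking the singular Jacobian at the maximum loading point.

```python
def corrector_in_x(lam, x0, tol=1e-10, iters=60):
    # Newton on f(x) = x^2 + lam - 1 with lam held fixed
    x = x0
    for _ in range(iters):
        fx = x * x + lam - 1.0
        if abs(fx) < tol:
            return x
        if x == 0.0:
            return None                  # df/dx = 0: corrector fails
        x -= fx / (2.0 * x)
    return None                          # no convergence (past the nose)

curve = [(0.0, 1.0)]                     # (lam, x) points, starting at lam = 0
lam, x = 0.0, 1.0
while True:                              # phase 1: lam is the continuation parameter
    sol = corrector_in_x(lam + 0.2, x)   # zero-order predictor: keep x, step lam
    if sol is None:                      # failure near the nose -> switch parameter
        break
    lam, x = lam + 0.2, sol
    curve.append((lam, x))
while x > -1.0:                          # phase 2: x is the parameter, solve for lam
    x -= 0.25
    lam = 1.0 - x * x                    # f is linear in lam: one Newton step suffices
    curve.append((lam, x))
```

With λ as the parameter the trace stalls at the nose (λ = 1); switching to x as the parameter carries the trace around the nose onto the lower branch, exactly the role parameter switching plays in tracing the full PV curve.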
Abstract:
The continuing development of new materials makes systems lighter and stronger, permitting more complex systems that provide more functionality and flexibility, which in turn demand more effective evaluation of their structural health. Smart material technology has become an area of increasing interest in this field. The combination of smart materials and artificial neural networks is an excellent tool for pattern recognition, making it well suited to the monitoring and fault classification of equipment and structures. In order to identify a fault, the neural network must be trained using a set of solutions to the corresponding forward variational problem. After the training process, the network can successfully solve the inverse variational problem in the context of monitoring and fault detection, owing to its pattern recognition and interpolation capabilities. The structural frequency response function (FRF) is a fundamental element of structural dynamic analysis, and it can be extracted from measured electric impedance through the electromechanical interaction of a piezoceramic and a structure. In this paper we use FRFs obtained from a mathematical (finite element) model to generate the training data for the neural networks; damage can then be identified from measured electric impedance, since suitable data normalization correlates the FRF and the electrical impedance.
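The detection principle (normalize response curves onto a common scale, then match a measured curve against patterns learned from model data) can be sketched with synthetic FRF-like signals. This deliberately replaces the trained neural network with the simplest possible pattern matcher (nearest normalized template); the curve shapes, peak positions, and templates are all invented for illustration.

```python
import math

def frf(peak, n=64):
    # synthetic single-resonance magnitude curve with peak location `peak`
    return [1.0 / (1.0 + (i - peak) ** 2 / 10.0) for i in range(n)]

def znorm(sig):
    # z-score normalization: the kind of rescaling that lets curves from
    # different measurement chains be compared on a common scale
    m = sum(sig) / len(sig)
    s = math.sqrt(sum((v - m) ** 2 for v in sig) / len(sig))
    return [(v - m) / s for v in sig]

def classify(sig, templates):
    z = znorm(sig)
    def err(name):
        t = znorm(templates[name])
        return sum((a - b) ** 2 for a, b in zip(z, t))
    return min(templates, key=err)

# damage shifts the resonance peak of the synthetic "structure"
templates = {"healthy": frf(20), "damaged": frf(28)}
label = classify(frf(27), templates)     # measured curve is nearer the damaged case
```

A trained network generalizes far better than template matching, but the pipeline is the same: model-generated responses define the patterns, and normalized measured responses are assigned to the closest one.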
Abstract:
Bolted joints are a form of mechanical coupling widely used in machinery due to their reliability and low cost. Failure of bolted joints can lead to catastrophic events, such as leaks, train derailments, and aircraft crashes. Most of these failures occur due to a reduction of the pre-load, induced by mechanical vibration or by human error during assembly or maintenance. This article investigates the application of shape memory alloy (SMA) washers as actuators to increase the pre-load on loosened bolted joints. The application of the SMA washer follows a structural health monitoring procedure that identifies the occurrence of damage (a reduction in pre-load). A thermo-mechanical model is presented to predict the final pre-load achieved with this kind of actuator, based on the heat input and the SMA washer dimensions. This model extends and improves on the previous model of Ghorashi and Inman [2004, "Shape Memory Alloy in Tension and Compression and its Application as Clamping Force Actuator in a Bolted Joint: Part 2 - Modeling," J. Intell. Mater. Syst. Struct., 15:589-600] by eliminating the pre-load term related to nut turning, making the system more practical. This complete model is a powerful but complex tool for designers. A novel modeling approach for self-healing bolted joints, based on curve fitting of experimental data, is presented. The article concludes with an experimental application that leads to a change in the joint assembly, removing the ceramic washer component to increase system reliability. Further research topics are also suggested.
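The curve-fitting approach mentioned above amounts to regressing the recovered pre-load on the controllable input. A minimal sketch with ordinary least squares follows; the (heat input, pre-load) pairs and the linear form are invented for illustration, since the real relationship comes from the article's thermo-mechanical model and experiments.

```python
def linear_fit(xs, ys):
    # ordinary least squares for y = a*x + b
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

heat_J    = [50, 100, 150, 200, 250]     # hypothetical heat input to the SMA washer
preload_N = [210, 405, 615, 790, 1010]   # hypothetical recovered pre-load

a, b = linear_fit(heat_J, preload_N)

def predicted_preload(q):
    # fitted design curve: how much heat to inject for a target pre-load
    return a * q + b
```

Once fitted, the designer inverts the curve: given the pre-load lost, the model says how much heat to supply to the washer, without nut turning.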
Abstract:
Research in Requirements Engineering has grown in recent years. Researchers are concerned with a set of open issues, such as communication among the several user profiles involved in software engineering, scope definition, and volatility and traceability. To cope with these issues, existing work has concentrated on (i) defining processes to collect clients' specifications, in order to solve scope issues; (ii) defining models to represent requirements, to address communication and traceability issues; and (iii) working on mechanisms and processes applied to requirements modeling, to facilitate requirements evolution and maintenance and thus address volatility and traceability issues. We propose an iterative model-driven process to address these issues, based on a double-layered CIM to communicate requirements-related knowledge to a wider range of stakeholders. We also present a tool to support the requirements engineer throughout the RE process. Finally, we present a case study to illustrate the benefits and usage of the process and tool.
Abstract:
The use of middleware technology in various types of systems, to abstract away low-level details related to the distribution of application logic, is increasingly common. Among the many systems that can benefit from these components, we highlight distributed systems, where communication must be enabled between software components located on different physical machines. An important issue in the communication between distributed components is the provision of mechanisms for managing quality of service. This work presents a metamodel for modeling component-based middleware that provides an application with an abstraction of the communication between the components involved in a data stream, regardless of their location. Another feature of the metamodel is self-adaptation of the communication mechanism, either by updating the values of its configuration parameters or by replacing it with another mechanism when the specified quality-of-service restrictions are not being met. To this end, the communication state is monitored (applying techniques such as a feedback control loop) and the related performance metrics are analyzed. The Model-Driven Development paradigm was used to generate the implementation of a middleware that serves as a proof of concept for the metamodel, together with the configuration and reconfiguration policies related to the dynamic adaptation processes. Accordingly, a metamodel associated with the configuration of a communication was defined. The MDD approach also includes the definition of the following transformations: from the architectural model of the middleware to Java code, and from the configuration model to XML.
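The adaptation policy (monitor a QoS metric each cycle, first retune the active mechanism's parameters, and replace the mechanism only if retuning is not enough) can be sketched as a tiny feedback loop. All mechanism names, parameters, and thresholds below are invented for illustration and are not part of the metamodel.

```python
QOS_LIMIT_MS = 100.0                      # hypothetical latency bound

class Channel:
    def __init__(self):
        self.mechanism = "tcp-stream"     # hypothetical default mechanism
        self.buffer_kb = 64
        self.retuned = False

    def adapt(self, measured_ms):
        # one cycle of the feedback loop: compare measurement to the bound
        if measured_ms <= QOS_LIMIT_MS:
            return "ok"
        if not self.retuned:              # first response: reconfigure parameters
            self.buffer_kb *= 2
            self.retuned = True
            return "retuned"
        self.mechanism = "udp-stream"     # last resort: replace the mechanism
        self.retuned = False
        return "replaced"

ch = Channel()
actions = [ch.adapt(ms) for ms in (80.0, 140.0, 150.0, 90.0)]
```

The two-stage response mirrors the metamodel's two adaptation paths: parameter update first, mechanism replacement only on persistent violation.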
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
We present a generic, spatially explicit modeling framework to estimate carbon emissions from deforestation (INPE-EM). The framework incorporates the temporal dynamics of the deforestation process and accounts for the biophysical and socioeconomic heterogeneity of the region under study. We build an emission model for the Brazilian Amazon combining annual maps of new clearings, four maps of biomass, and a set of alternative parameters based on the recent literature. The most important results are as follows. (a) Using different biomass maps leads to large differences in emission estimates; for the entire Brazilian Amazon in the last decade, emission estimates for primary forest deforestation range from 0.21 to 0.26 Pg C yr-1. (b) Secondary vegetation growth has a small impact on the emission balance because of the short duration of secondary vegetation; on average, the balance is only 5% smaller than the primary forest deforestation emissions. (c) Deforestation rates decreased significantly in the Brazilian Amazon in recent years, from 27 Mkm2 in 2004 to 7 Mkm2 in 2010. INPE-EM process-based estimates reflect this decrease even though the agricultural frontier is moving into areas of higher biomass. The decrease is slower than a non-process, instantaneous model would estimate, because the process model accounts for residual emissions (slash, wood products, and secondary vegetation). The average balance, considering all biomass maps, decreases from 0.28 Pg C yr-1 in 2004 to 0.15 Pg C yr-1 in 2009; the non-process model estimates a decrease from 0.33 to 0.10 Pg C yr-1. We conclude that INPE-EM is a powerful tool for representing deforestation-driven carbon emissions. Biomass estimates remain the largest source of uncertainty in the effective use of this type of model for informing mechanisms such as REDD+.
The results also indicate that efforts to reduce emissions should focus not only on controlling primary forest deforestation but also on creating incentives for the restoration of secondary forests.
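The difference between process-based and instantaneous accounting can be sketched with simple bookkeeping: the instantaneous model emits all of a year's cleared carbon that year, while the process model emits a fraction immediately and releases the rest (slash, wood products) by first-order decay in later years. The fractions, decay rate, and clearing series below are made-up illustrative numbers, not INPE-EM parameters.

```python
def instantaneous(cleared_carbon):
    # non-process model: everything cleared in a year is emitted that year
    return list(cleared_carbon)

def process_based(cleared_carbon, inst_frac=0.5, decay=0.3, extra_years=30):
    # process model: inst_frac emitted at clearing time, the remaining pool
    # (slash + products) released by first-order decay over following years
    n = len(cleared_carbon)
    out = [0.0] * (n + extra_years)
    for year, c in enumerate(cleared_carbon):
        pool = c * (1.0 - inst_frac)
        out[year] += c * inst_frac
        for t in range(year + 1, n + extra_years):
            release = pool * decay
            out[t] += release
            pool -= release
    return out

cleared = [10.0, 10.0, 10.0, 4.0, 2.0]   # clearing rates drop sharply, as in (c)
inst = instantaneous(cleared)
proc = process_based(cleared)
```

When clearing rates fall, the process model's annual emissions stay above the instantaneous estimate because residual pools from earlier clearings are still emitting, while the long-run totals of the two models agree (the same carbon is committed either way).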
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)