948 results for distributed model


Relevance:

30.00%

Publisher:

Abstract:

Remanufacturing is the process of rebuilding used products so that the quality of remanufactured products is equivalent to that of new ones. Although the theme is gaining ground, it is still little explored due to lack of knowledge and the difficulty of visualizing it systemically and implementing it effectively. Few models treat remanufacturing as a system; most studies still treat remanufacturing as an isolated process, preventing it from being seen in an integrated manner. Therefore, the aim of this work is to organize the knowledge about remanufacturing, offering a vision of the remanufacturing system and contributing to an integrated view of the theme. The methodology employed was a literature review, adopting the General Theory of Systems to characterize the remanufacturing system. This work consolidates and organizes the elements of this system, enabling a better understanding of remanufacturing and assisting companies in adopting the concept.

Relevance:

30.00%

Publisher:

Abstract:

Aspects related to users' cooperative work are not considered in the traditional approach of software engineering, since the user is viewed independently of his/her workplace environment or group, with the individual model generalized to the study of the collective behavior of all users. This work proposes a software requirements process to address issues involving cooperative work in information systems that provide distributed coordination of users' actions, where communication among users occurs indirectly through the data entered while using the software. To achieve this goal, this research uses ergonomics, the 3C cooperation model, awareness, and software engineering concepts. Action research is used as the research methodology, applied in three cycles during the development of a corporate workflow system in a technological research company. This article discusses the third cycle, which corresponds to the process of refining the cooperative work requirements with the software in actual use in the workplace, where the introduction of a computer system changes the users' workplace from face-to-face interaction to interaction mediated by the software. The results showed that a higher degree of users' awareness of their own activities and of other system users contributes to a decrease in their errors and in inappropriate use of the system.

Relevance:

30.00%

Publisher:

Abstract:

Abstract Background Cell adhesion molecules (CAMs) are essential for maintaining tissue integrity by regulating intercellular and cell-to-extracellular-matrix interactions. Cadherins and catenins are CAMs located on the cell membrane that are important for adherens junction (AJ) function. This study aims to verify whether a hypercholesterolemic diet (HCD) or bladder outlet obstruction (BOO) promotes structural bladder wall modifications, specifically alterations in the expression of cadherins and catenins in detrusor muscle cells. Methods Forty-five 4-week-old female Wistar rats were divided into three groups: group 1 was a control group fed a normal diet (ND); group 2 was the BOO model, fed a ND; and group 3 was a control group fed a HCD (1.25% cholesterol). Initially, serum cholesterol, LDL cholesterol, and body weight were determined. Four weeks later, groups 1 and 3 underwent a sham operation, whereas group 2 underwent a partial BOO procedure that included a suture tied around the urethra. Six weeks later, all rats had their bladders removed, and the previous exams were repeated. The expression levels of N-, P-, and E-cadherin, cadherin-11, and alpha-, beta-, and gamma-catenins were evaluated by immunohistochemistry with semiquantitative analysis. Results Wistar rats fed a HCD (group 3) exhibited a significant increase in LDL cholesterol levels (p = 0.041) and body weight (p = 0.017) compared to both groups fed a normal diet over the ten-week period. We found higher β- and γ-catenin expression in groups 2 and 3 compared to group 1 (p = 0.042 and p = 0.044, respectively). We also observed cadherin-11 overexpression in group 3 compared to groups 1 and 2 (p = 0.002). Conclusions A HCD in Wistar rats promoted, in addition to higher body weight gain and increased serum LDL cholesterol levels, overexpression of β- and γ-catenin in detrusor muscle cells. A similar finding was observed in the BOO group. Higher cadherin-11 expression was observed only in the HCD-treated rats. These findings may be associated with bladder dysfunctions that occur under such conditions.

Relevance:

30.00%

Publisher:

Abstract:

Abstract Background Organ-sharing criteria have developed into a system that prioritizes liver transplantation (LT) for patients with hepatocellular carcinoma (HCC) who have the highest risk of wait-list mortality. In some countries this model allows only patients within the Milan Criteria (MC, defined by the presence of a single nodule up to 5 cm, or up to three nodules none larger than 3 cm, with no evidence of extrahepatic spread or macrovascular invasion) to be evaluated for liver transplantation. This policy implies that some patients with HCC slightly more advanced than allowed by the current strict selection criteria will be excluded, even though LT for these patients might be associated with acceptable long-term outcomes. Methods We propose a mathematical approach to study the consequences of relaxing the MC for patients with HCC who do not comply with the current rules for inclusion in the transplantation candidate list. We consider overall 5-year survival rates compatible with those reported in the literature. We calculate the strategy that would minimize the total mortality of the affected population, that is, the total number of people in both groups of HCC patients who die within 5 years of the implementation of the strategy, either by post-transplantation death or by death due to the underlying HCC. We illustrate the analysis with a simulation of a theoretical population of 1,500 HCC patients with exponentially distributed tumor sizes. The parameter λ, obtained from the literature, was equal to 0.3. As the total number of patients in the real samples was 327, this implied an average size of 3.3 cm and a 95% confidence interval of [2.9; 3.7]. The total number of available livers to be grafted was assumed to be 500.
Results With 1,500 patients on the waiting list and 500 grafts available, we simulated the total number of deaths among both transplanted and non-transplanted HCC patients after 5 years as a function of the tumor size of transplanted patients. The total number of deaths drops monotonically with tumor size, reaching a minimum at 7 cm and increasing thereafter. At a tumor size of 10 cm, the total mortality equals that at the 5 cm threshold of the Milan criteria. Conclusion We conclude that it is possible to include patients with tumor sizes up to 10 cm without increasing the total mortality of this population.
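The trade-off described above can be sketched as a small simulation. This is a hypothetical illustration, not the paper's code: the population size (1,500), graft count (500), and exponential tumor-size distribution with λ = 0.3 follow the abstract, but the two 5-year death-probability curves are placeholder assumptions rather than the literature-based survival rates the authors used.

```python
import random

# Hypothetical sketch of the simulation in the abstract: 1,500 HCC patients
# with exponentially distributed tumor sizes (lambda = 0.3, mean ~3.3 cm)
# competing for 500 grafts. The two death-probability curves below are
# placeholder assumptions, NOT the paper's literature-based survival rates.

N_PATIENTS, N_GRAFTS, LAM = 1500, 500, 0.3

def death_prob_transplanted(size_cm):
    # assumption: post-transplant mortality worsens with tumor size
    return min(0.20 + 0.06 * size_cm, 0.95)

def death_prob_untreated(size_cm):
    # assumption: untreated HCC mortality is high and worsens with size
    return min(0.30 + 0.09 * size_cm, 0.98)

def expected_deaths(cutoff_cm, sizes):
    """Expected 5-year deaths when the grafts are allocated at random
    among patients whose tumor size does not exceed the cutoff."""
    eligible = [s for s in sizes if s <= cutoff_cm]
    frac_tx = min(N_GRAFTS, len(eligible)) / len(eligible) if eligible else 0.0
    deaths = 0.0
    for s in sizes:
        if s <= cutoff_cm:  # eligible: transplanted with probability frac_tx
            deaths += frac_tx * death_prob_transplanted(s) \
                + (1 - frac_tx) * death_prob_untreated(s)
        else:               # beyond the cutoff: never transplanted
            deaths += death_prob_untreated(s)
    return deaths

random.seed(1)
sizes = [random.expovariate(LAM) for _ in range(N_PATIENTS)]
results = {c: expected_deaths(c, sizes) for c in range(3, 13)}  # cutoffs in cm
best = min(results, key=results.get)
print(best, round(results[best], 1))
```

Under these placeholder curves the minimizing cutoff is an artifact of the assumed survival functions; the paper, using survival rates from the literature, finds the minimum at 7 cm.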

Relevance:

30.00%

Publisher:

Abstract:

Abstract Background Over the last years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks were designed considering the abstractions provided by this new paradigm. We call this type of framework a Crosscutting Framework (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. However, most of the proposed CFs employ white-box strategies in their reuse process, requiring two main technical skills: (i) knowing the syntax details of the programming language employed to build the framework and (ii) being aware of the architectural details of the CF and its internal nomenclature. Another problem is that the reuse process can only start once development reaches the implementation phase, preventing it from starting earlier. Method To solve these problems, we present in this paper a model-based approach for reusing CFs which shields application engineers from technical details, letting them concentrate on what the framework really needs from the application under development. To support our approach, two models are proposed: the Reuse Requirements Model (RRM) and the Reuse Model (RM). The former is used to describe the framework structure, and the latter is in charge of supporting the reuse process. As soon as the application engineer has filled in the RM, the reuse code can be generated automatically. Results We also present the results of two comparative experiments using two versions of a Persistence CF: the original one, whose reuse process is based on writing code, and the new one, which is model-based. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with both CF versions. The results show an improvement of 97% in productivity; however, little difference was perceived regarding the effort of maintaining the resulting applications. Conclusion Using the approach presented here, we conclude the following: (i) it is possible to automate the instantiation of CFs, and (ii) developer productivity improves when a model-based instantiation approach is used.

Relevance:

30.00%

Publisher:

Abstract:

[EN] The General Curvilinear Environmental Model is a high-resolution system composed of the General Curvilinear Coastal Ocean Model (GCCOM) and the General Curvilinear Atmospheric Model (GCAM). Both modules are capable of reading a general curvilinear grid, orthogonal as well as non-orthogonal in all three directions. The two modules are weakly coupled using the distributed coupling toolkit (DCT). The model can also be nested within larger models, and users can interact with the model and run it using a web-based computational environment.

Relevance:

30.00%

Publisher:

Abstract:

[EN] In this paper, we have used Geographical Information Systems (GIS) to solve the planar Huff problem considering different demand distributions and forbidden regions. Most papers on competitive location problems consider demand aggregated at a finite set of points. In a few other cases, the models suppose that demand is distributed over the feasible region according to a functional form, mainly a uniform distribution. In this work, in addition to the discrete and uniform demand distributions, we consider demand represented by a population surface model, that is, a raster map where each pixel has an associated value corresponding to the population living in the area it covers...
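The Huff model underlying this formulation can be illustrated with a short sketch: each raster cell's demand is split among facilities in proportion to attractiveness divided by distance raised to a decay exponent. The toy population grid, facility coordinates, attractiveness values, and exponent BETA below are hypothetical; the paper itself works with GIS raster layers, not Python lists.

```python
import math

# Illustrative sketch of Huff market capture over a raster demand surface
# (not the paper's GIS implementation). Grid, facilities, and BETA are
# hypothetical.

BETA = 2.0  # distance-decay exponent

def huff_capture(population_grid, cell_size, competitors, new_facility):
    """Expected demand captured by `new_facility`; each raster cell holds
    the population living in it, facilities are (x, y, attractiveness)."""
    def utility(f, x, y):
        d = max(math.hypot(x - f[0], y - f[1]), 1e-9)  # avoid zero distance
        return f[2] / d ** BETA
    captured = 0.0
    for r, row in enumerate(population_grid):
        for c, pop in enumerate(row):
            if pop <= 0:
                continue
            x, y = (c + 0.5) * cell_size, (r + 0.5) * cell_size  # centroid
            u_new = utility(new_facility, x, y)
            total = u_new + sum(utility(f, x, y) for f in competitors)
            captured += pop * u_new / total  # Huff probability * cell demand
    return captured

# toy 3x3 raster, one existing competitor, one symmetric candidate site
grid = [[10, 20, 10],
        [ 0, 50,  0],
        [10, 20, 10]]
competitors = [(0.5, 0.5, 1.0)]
candidate = (2.5, 2.5, 1.0)
print(huff_capture(grid, 1.0, competitors, candidate))  # half of 130 by symmetry
```

Forbidden regions could be handled in this sketch by simply rejecting candidate coordinates that fall inside them before evaluating `huff_capture`.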

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes modelling tools and methods suited for complex systems (systems that are typically represented by a plurality of models). The basic idea is that all models representing the system should be linked by well-defined model operations in order to build a structured repository of information: a hierarchy of models. The port-Hamiltonian framework is a good candidate for solving this kind of problem, as it supports the most important model operations natively. The thesis in particular addresses the problem of integrating distributed parameter systems into a model hierarchy, and shows two possible mechanisms to do so: a finite-element discretization in port-Hamiltonian form, and a structure-preserving model order reduction for discretized models obtainable from commercial finite-element packages.

Relevance:

30.00%

Publisher:

Abstract:

The aim of the thesis is to formulate a suitable Item Response Theory (IRT) based model to measure HRQoL (as a latent variable) using a mixed-responses questionnaire and relaxing the hypothesis of a normally distributed latent variable. The new model combines two models already presented in the literature: a latent trait model for mixed responses and an IRT model for a skew-normal latent variable. It is developed in a Bayesian framework; a Markov chain Monte Carlo procedure is used to generate samples from the posterior distribution of the parameters of interest. The proposed model is tested on a questionnaire composed of five discrete items and one continuous item designed to measure HRQoL in children, the EQ-5D-Y questionnaire, using a large sample of children collected in schools. In comparison with a model for only discrete responses and a model for mixed responses with a normal latent variable, the new model performs better in terms of the deviance information criterion (DIC), chain convergence times, and precision of the estimates.
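Two of the ingredients named above can be illustrated in a minimal sketch: a skew-normal latent trait, generated via the standard representation δ|z₀| + √(1-δ²)z₁, driving responses to a single two-parameter logistic (2PL) item. The shape alpha, discrimination a, and difficulty b are hypothetical choices; the thesis itself handles five discrete items plus one continuous item within a Bayesian MCMC framework, which this sketch does not attempt.

```python
import math
import random

# Minimal illustration (not the thesis model): a skew-normal latent trait
# driving a single 2PL item. alpha, a, and b are hypothetical.

random.seed(0)

def skew_normal(alpha):
    """Draw from SN(0, 1, alpha) via delta*|z0| + sqrt(1-delta^2)*z1."""
    delta = alpha / math.sqrt(1.0 + alpha * alpha)
    z0, z1 = random.gauss(0, 1), random.gauss(0, 1)
    return delta * abs(z0) + math.sqrt(1.0 - delta * delta) * z1

def p_positive(theta, a=1.2, b=0.0):
    """2PL probability of endorsing the item given latent trait theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

thetas = [skew_normal(alpha=3.0) for _ in range(10_000)]
responses = [random.random() < p_positive(t) for t in thetas]
mean_theta = sum(thetas) / len(thetas)  # ~ delta*sqrt(2/pi) ≈ 0.76 for alpha=3
print(round(mean_theta, 2), round(sum(responses) / len(responses), 2))
```

Relaxing normality this way shifts the latent distribution's mean and skews its tails, which is precisely what the thesis exploits when the normal assumption is inappropriate.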

Relevance:

30.00%

Publisher:

Abstract:

The wide diffusion of cheap, small, and portable sensors integrated into a large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if analyzed properly and in a timely manner, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to provisioning differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and by identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, offering a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, through a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.

Relevance:

30.00%

Publisher:

Abstract:

The distribution pattern of European arctic-alpine disjunct species is of growing interest among biogeographers due to the emerging variety of inferred demographic histories. In this thesis I used the co-distributed mayfly Ameletus inopinatus and the stonefly Arcynopteryx compacta as model species to investigate the European Pleistocene and Holocene history of stream-inhabiting arctic-alpine aquatic insects. I used last glacial maximum (LGM) species distribution models (SDM) to derive hypotheses on glacial survival during the LGM and the recolonization of Fennoscandia: 1) both species potentially survived glacial cycles in periglacial, extra-Mediterranean refugia, and 2) postglacial recolonization of Fennoscandia originated from these refugia. I tested these hypotheses using mitochondrial sequence (mtCOI) and species-specific microsatellite data. Additionally, I used future SDM to predict the impact of climate-change-induced range shifts and habitat loss on the overall genetic diversity of the endangered mayfly A. inopinatus.

I observed old lineages, deep splits, and almost complete lineage sorting of mtCOI sequences between mountain ranges. These results support the hypothesis that both species persisted in multiple periglacial extra-Mediterranean refugia in Central Europe during the LGM. However, the recolonization of Fennoscandia was very different between the two study species. For the mayfly A. inopinatus I found strong differentiation between the Fennoscandian and all other populations in both sequence and microsatellite data, indicating that Fennoscandia was recolonized from an extra-European refugium. High mtCOI genetic structure within Fennoscandia supports a recolonization by multiple lineages from independent refugia. However, this structure was not apparent in the microsatellite data, consistent with secondary contact without sexual incompatibility. In contrast, the stonefly A. compacta exhibited low genetic structure and shared mtCOI haplotypes between Fennoscandia and the Black Forest, suggesting a shared Pleistocene refugium in the periglacial tundra belt. Again, there is incongruence with the microsatellite data, which could be explained by ancestral polymorphism or female-biased dispersal. Future SDM project major regional habitat loss for the mayfly A. inopinatus, particularly in Central European mountain ranges. By relating these range shifts to my population genetic results, I identified conservation units, primarily in Eastern Europe, that if preserved would maintain high levels of the present-day genetic diversity of A. inopinatus and continue to provide long-term suitable habitat under future climate warming scenarios.

In this thesis I show that, despite similar present-day distributions, the underlying demographic histories of the study species are vastly different, which might be due to differing dispersal capabilities and niche plasticity. I present genetic, climatic, and ecological data that can be used to prioritize conservation efforts for cold-adapted freshwater insects in light of future climate change. Overall, this thesis provides a next step in filling the knowledge gap regarding molecular studies of the arctic-alpine invertebrate fauna. However, there is a continued need to explore the phenomenon of arctic-alpine disjunctions to help understand the processes of range expansion, regression, and lineage diversification in Europe's high-latitude and high-altitude biota.