985 results for integration methods
Abstract:
The main drivers for the development and evolution of Cyber Physical Systems (CPS) are the reduction of development costs and time along with the enhancement of the designed products. The aim of this survey paper is to provide an overview of different types of systems and the associated transition process from mechatronics to CPS and cloud-based (IoT) systems. It further considers the requirement that methodologies for CPS design be part of a multi-disciplinary development process in which designers focus not only on the separate physical and computational components but also on their integration and interaction. Challenges related to CPS design are therefore considered in the paper from the perspectives of the physical processes, computation, and integration, respectively. Illustrative case studies are selected from different system levels, starting with a description of the overarching concept of Cyber Physical Production Systems (CPPSs). The analysis and evaluation of the specific properties of a sub-system, using a condition monitoring system important for maintenance purposes, is then given for a wind turbine.
Abstract:
In medicine, innovation depends on better knowledge of the human body, which is a complex system of multi-scale constituents, and unraveling the complexity underlying diseases proves challenging. A deep understanding of the body's inner workings requires dealing with much heterogeneous information. Exploring the molecular status and organization of genes, proteins and metabolites provides insights into what drives a disease, from aggressiveness to curability. Molecular constituents, however, are only the building blocks of the human body and cannot currently tell the whole story of disease. This is why attention is now turning to the joint exploitation of multi-scale information, and holistic methods are drawing interest for the problem of integrating heterogeneous data. The heterogeneity may derive both from diversity across data types and from diversity within diseases. Here, four studies conducted data integration using custom-designed workflows that implement novel methods and views to tackle the heterogeneous characterization of diseases. The first study was devoted to determining shared gene regulatory signatures in onco-hematology and showed partial co-regulation across blood-related diseases. The second study focused on Acute Myeloid Leukemia and refined the unsupervised integration of genomic alterations, which turned out to better resemble clinical practice. In the third study, network integration for atherosclerosis demonstrated, as a proof of concept, the impact of network intelligibility when modeling heterogeneous data, which was shown to accelerate the identification of new potential pharmaceutical targets. Lastly, the fourth study introduced a new method to integrate multiple data types into a unique latent heterogeneous representation, which facilitated the selection of the data types most important for predicting the tumour stage of invasive ductal carcinoma. The results of these four studies lay the groundwork for easing the detection of new biomarkers, ultimately benefiting medical practice and the ever-growing field of Personalized Medicine.
Abstract:
Background: High-density tiling arrays and new sequencing technologies are generating rapidly increasing volumes of transcriptome and protein-DNA interaction data. Visualization and exploration of these data are critical to understanding the regulatory logic encoded in the genome, by which the cell dynamically affects its physiology and interacts with its environment. Results: The Gaggle Genome Browser is a cross-platform desktop program for interactively visualizing high-throughput data in the context of the genome. Important features include dynamic panning and zooming, keyword search, and open interoperability through the Gaggle framework. Users may bookmark locations on the genome with descriptive annotations and share these bookmarks with other users. The program handles large sets of user-generated data using an in-process database and leverages the facilities of SQL and the R environment for importing and manipulating data. A key aspect of the Gaggle Genome Browser is interoperability. By connecting to the Gaggle framework, the genome browser joins a suite of interconnected bioinformatics tools for analysis and visualization, with connectivity to major public repositories of sequences, interactions and pathways. To this flexible environment for exploring and combining data, the Gaggle Genome Browser adds the ability to visualize diverse types of data in relation to their coordinates on the genome. Conclusions: Genomic coordinates function as a common key by which disparate biological data types can be related to one another. In the Gaggle Genome Browser, heterogeneous data are joined by their location on the genome to create information-rich visualizations yielding insight into genome organization, transcription and its regulation, and, ultimately, a better understanding of the mechanisms that enable the cell to dynamically respond to its environment.
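The "common key" idea in the Conclusions can be made concrete with a short sketch: two hypothetical feature tracks related purely by interval overlap on genome coordinates. This is illustrative Python, not the Gaggle Genome Browser's actual code, and all names are invented:

```python
# A minimal sketch of genomic coordinates as a join key: two hypothetical
# feature tracks are related by interval overlap (illustrative only).
from typing import NamedTuple

class Feature(NamedTuple):
    start: int   # genome coordinate, inclusive
    end: int     # genome coordinate, exclusive
    label: str

transcripts = [Feature(100, 500, "geneA"), Feature(800, 1200, "geneB")]
chip_peaks = [Feature(450, 520, "peak1"), Feature(900, 950, "peak2")]

def overlaps(a: Feature, b: Feature) -> bool:
    return a.start < b.end and b.start < a.end

# Join heterogeneous tracks on location: which peaks fall within which genes?
joined = [(t.label, p.label)
          for t in transcripts for p in chip_peaks if overlaps(t, p)]
print(joined)   # [('geneA', 'peak1'), ('geneB', 'peak2')]
```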
Abstract:
It has been demonstrated that laser-induced breakdown spectrometry (LIBS) can be used as an alternative method for the determination of macronutrients (P, K, Ca, Mg) and micronutrients (B, Fe, Cu, Mn, Zn) in pellets of plant materials. However, information is required regarding sample preparation for plant analysis by LIBS. In this work, methods involving cryogenic grinding and planetary ball milling were evaluated for leaf comminution before pellet preparation. The particle sizes were associated with chemical properties of the samples, such as fiber and cellulose contents, as well as with pellet porosity and density. The pellets were ablated at 30 different sites by applying 25 laser pulses per site (Nd:YAG at 1064 nm, 5 ns, 10 Hz, 25 J cm⁻²). The plasma emission collected by lenses was directed through an optical fiber towards a high-resolution echelle spectrometer equipped with an ICCD. Delay time and integration time gate were fixed at 2.0 and 4.5 µs, respectively. Experiments carried out with pellets of sugarcane, orange tree and soy leaves showed a significant effect of plant species on the choice of the most appropriate grinding conditions. Using ball milling with agate materials, 20 min of grinding for orange tree and soy leaves, and 60 min for sugarcane leaves, led to particle size distributions generally below 75 µm. Cryogenic grinding yielded similar particle size distributions after 10 min for orange tree, 20 min for soy and 30 min for sugarcane leaves. Improving the particle size distribution, and consequently the pellet porosity, enhanced the emission signal in LIBS measurements by up to 50% for most elements.
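For orientation, the quoted fluence and pulse duration imply a peak irradiance of roughly (a derived back-of-the-envelope figure, not one reported in the abstract):

$$ I \approx \frac{F}{\tau} = \frac{25\ \mathrm{J\,cm^{-2}}}{5\ \mathrm{ns}} = 5 \times 10^{9}\ \mathrm{W\,cm^{-2}} $$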
Abstract:
Understanding a product's end-of-life is important to reduce the environmental impact of its final disposal. When the initial stages of product development consider end-of-life aspects, which can be established through ecodesign (a proactive approach to environmental management that aims to reduce the total environmental impact of products), it becomes easier to close the loop of materials. End-of-life ecodesign methods generally include more than one end-of-life strategy. Since product complexity varies substantially, some components, systems or sub-systems are easier to recycle, reuse or remanufacture than others. Remanufacture is an effective way to keep products in a closed loop, reducing both the environmental impacts and the costs of manufacturing processes. This paper presents ecodesign methods focused on the integration of different end-of-life strategies, with special attention to remanufacturing, given its increasing importance in the international scenario for reducing the life cycle impacts of products.
Abstract:
Here, we study the stable integration of real-time optimization (RTO) with model predictive control (MPC) in a three-layer structure. The intermediate layer is a quadratic programming (QP) problem whose objective is to compute reachable targets for the MPC layer that lie at minimum distance from the optimum set points produced by the RTO layer. The lower layer is an infinite-horizon MPC with guaranteed stability, with additional constraints that enforce the feasibility and convergence of the target calculation layer. We also consider the case in which there is polytopic uncertainty in the steady-state model used in the target calculation. The dynamic part of the MPC model is likewise considered unknown, but it is assumed to be represented by one member of a discrete set of models. The efficiency of the methods presented here is illustrated with the simulation of a low-order system.
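As an illustration of the intermediate layer's role, here is a minimal sketch of a target-calculation problem: it computes the input/output pair reachable under input bounds that lies closest to the RTO set points. The toy two-output/one-input steady-state gain model and all names are assumptions for illustration, not the system studied in the paper:

```python
# A minimal sketch of the target-calculation layer between RTO and MPC:
# find the reachable steady state closest to the RTO set points.
import numpy as np
from scipy.optimize import minimize

# Toy linear steady-state model: y_ss = G @ u_ss (assumed for illustration)
G = np.array([[0.8], [0.3]])          # hypothetical steady-state gains
y_rto = np.array([1.0, 0.5])          # optimal set points from the RTO layer
u_min, u_max = -2.0, 2.0              # input bounds the MPC can honor

def distance(u):
    # Squared distance between the reachable output and the RTO set points
    return np.sum((G @ u - y_rto) ** 2)

res = minimize(distance, x0=np.zeros(1), bounds=[(u_min, u_max)])
y_target = G @ res.x                  # reachable target passed to the MPC layer
print("input target:", res.x, "output target:", y_target)
```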
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development methods (e.g. formal methods). Induction is the key to deriving and verifying functional programs, but it can be simplified by packaging proofs with functions, particularly folds, on data structures. Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data structures as platonic combinators - the functions characteristic of the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that they inherit fold-theoretic properties, with some apparent simplifications due to the combinator representation. However, despite observable behaviour within functional programming suggesting that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
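The claim that a platonic combinator is a kind of partially-applied fold can be illustrated with a Church-style encoding, sketched here in Python rather than the paper's notation; the names are illustrative only:

```python
# A minimal sketch of "data as platonic combinators": a list is represented
# directly as the function characteristic of lists, i.e. its own fold.

def nil(cons_op, empty):
    return empty

def cons(head, tail):
    # Supplying (cons_op, empty) to the combinator runs the fold over it.
    return lambda cons_op, empty: cons_op(head, tail(cons_op, empty))

xs = cons(1, cons(2, cons(3, nil)))          # the list [1, 2, 3] as a combinator
total = xs(lambda h, acc: h + acc, 0)        # folding with (+, 0) sums it
as_list = xs(lambda h, acc: [h] + acc, [])   # folding rebuilds a Python list
print(total, as_list)                        # 6 [1, 2, 3]
```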
Abstract:
This article deals with the efficiency of fractional integration parameter estimators. The study was based on Monte Carlo experiments involving simulated stochastic processes with integration orders in the range ]-1, 1[. The evaluated estimation methods were classified into two groups: heuristic and semiparametric/maximum likelihood (ML). The study revealed that the comparative efficiency of the estimators, measured by the lower mean squared error, depends on the stationarity/non-stationarity and persistence/anti-persistence conditions of the series. The ML estimator was shown to be superior for stationary persistent processes; the wavelet-spectrum-based estimators were better for non-stationary mean-reverting and invertible anti-persistent processes; and the weighted-periodogram-based estimator was superior for non-invertible anti-persistent processes.
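For concreteness, here is a minimal sketch of one well-known semiparametric estimator of the integration order d, the GPH log-periodogram regression; it is representative of the semiparametric family discussed here, though not necessarily one of the exact estimators the article compares:

```python
# A minimal sketch of the GPH log-periodogram estimator of the fractional
# integration order d: regress the log periodogram on log(4 sin^2(freq/2))
# at low frequencies; the slope of that regression is -d.
import numpy as np

def gph_estimate(x, frac=0.5):
    n = len(x)
    m = int(n ** frac)                     # number of low frequencies used
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    fft = np.fft.fft(x - np.mean(x))
    periodogram = (np.abs(fft[1:m + 1]) ** 2) / (2 * np.pi * n)
    regressor = np.log(4 * np.sin(freqs / 2) ** 2)
    slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
    return -slope

rng = np.random.default_rng(0)
white = rng.standard_normal(2048)
print(gph_estimate(white))                 # approximately 0 for white noise (d = 0)
```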
Abstract:
Introduction. A fundamental aspect of planning future actions is the performance and control of motor tasks, a behaviour accomplished through sensory-motor integration. Aim. To explain the electrophysiological mechanisms in the cortex (modifications of the alpha band) that are involved in anticipatory actions when individuals have to catch a free-falling object. Subjects and methods. The sample was made up of 20 healthy subjects of both sexes (11 males and 9 females), aged between 25 and 40 years (32.5 ± 7.5), free of mental or physical diseases (by previous medical history); the subjects were right-handed (Edinburgh Inventory) and were not taking any psychoactive or psychotropic substances at the time of the study. The experiment consisted of a task in which subjects had to catch freely falling objects, organized in six blocks of 15 trials; each block lasted 2 minutes and 30 seconds, with activity recorded before and for two seconds after each ball was dropped. Results. An interaction of the factors moment and position was observed only for the right parieto-occipital cortex, in the combination of electrodes P4-O2. Conclusion. These findings suggest that the right parieto-occipital cortex plays an important role in increasing expectation and swiftness in the process of preparing for a motor task.
Abstract:
This paper presents a method of formally specifying, refining and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling the complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification, we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.
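For orientation, the refinement notion underlying the unified method is CSP refinement; in its simplest (traces) form, a process refines a specification when its traces are included in the specification's. The toy sketch below shows only that set-inclusion idea, not the paper's Object-Z/CSP machinery, and the event names are invented:

```python
# A minimal sketch of CSP traces refinement: P refines S when every finite
# trace of P is also a trace of S (toy finite trace sets, illustrative only).
spec_traces = {(), ("req",), ("req", "ack")}
impl_traces = {(), ("req",)}

def traces_refines(spec: set, impl: set) -> bool:
    return impl <= spec     # set inclusion of finite traces

print(traces_refines(spec_traces, impl_traces))  # True: impl refines spec
```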
Abstract:
Nowadays, various standards exist for individual management systems (MSs), at least one for each stakeholder, and new ones will be published. An integrated management system (IMS) aims to integrate some or all components of the business into one coherent and efficient MS. Maximizing integration is increasingly a strategic priority, in that it constitutes an opportunity to eliminate or reduce potential factors that destroy value for organizations, and also to become more competitive and consequently promote sustainable success. A preliminary investigation was conducted at a Portuguese industrial company which, over the years, has gradually adopted, in whole or in part, individual management system standards (MSSs). A survey, through a questionnaire, was performed with the objective of developing, in a real business environment, an adequate and efficient IMS-QES (quality, environment, and safety) model, and of laying the groundwork for a future generic IMS model integrating other MSSs. The strategy and research methods followed a case-study approach. A set of relevant conclusions was obtained from the statistical analyses of the responses to the survey. Globally, the investigation results justified and prioritized the conception of a development model for the IMS-QES and the consequent definition and validation of the structure of an IMS-QES model, to be implemented at the small- and medium-sized enterprise (SME) where the investigation was conducted.
Abstract:
INTRODUCTION: The aim of this study was to assess the epidemiological and operational characteristics of the Leprosy Program before and after its integration into the Primary Healthcare Services of the municipality of Aracaju, Sergipe, Brazil. METHODS: Data were drawn from the national database. The study periods were divided into preintegration (1996-2000) and postintegration (2001-2007). Annual epidemiological detection rates were calculated. Frequency data on clinico-epidemiological variables of cases detected and treated in the two periods were compared using the Chi-squared (χ2) test at a 5% level of significance. RESULTS: Detection rates overall, and in subjects younger than 15 years, were greater for the postintegration period and were higher than the rates recorded for Brazil as a whole during the same periods. A total of 780 and 1,469 cases were registered during the preintegration and postintegration periods, respectively. Observations for the postintegration period were as follows: I) a higher proportion of cases with disability grade assessed at diagnosis, rising from 60.9% to 78.8% (p < 0.001), and at the end of treatment, from 41.4% to 44.4% (p < 0.023); II) an increase in the proportion of cases detected by contact examination, from 2.1% to 4.1% (p < 0.001); and III) a lower level of treatment default, with a decrease from 5.64 to 3.35 (p < 0.008). Only 34% of cases registered from 2001 to 2007 were examined. CONCLUSIONS: The shifts observed in detection rates overall, and in subjects younger than 15 years, during the postintegration period indicate an increased level of health care access. The fall in the number of patients abandoning treatment indicates greater adherence to treatment. However, previous shortcomings in key actions, pivotal to attaining the outcomes and impact envisaged for the program, persisted in the postintegration period.
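As a back-of-the-envelope check of item I, approximate counts can be reconstructed from the quoted percentages and period totals (780 and 1,469 cases) and compared with a chi-squared test; the counts below are rounded estimates, not the study's raw data:

```python
# Approximate counts reconstructed from the abstract: disability grade
# assessed at diagnosis in 60.9% of 780 preintegration cases vs. 78.8%
# of 1,469 postintegration cases (rounded estimates, not raw data).
from scipy.stats import chi2_contingency

pre_yes, pre_total = round(0.609 * 780), 780
post_yes, post_total = round(0.788 * 1469), 1469
table = [[pre_yes, pre_total - pre_yes],
         [post_yes, post_total - post_yes]]

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")   # p well below 0.001, as reported
```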
Abstract:
Introduction: The association between leprosy and pregnancy is currently poorly understood and has been linked to serious clinical consequences. Methods: A retrospective study covering 2007 to 2009 was performed in the integration region of Carajás, Brazil on a population of pregnant women with leprosy, with women without leprosy aged 12-49 years serving as the reference population. Results: Twenty-nine pregnant women with leprosy were identified during the study period. The detection rates (DRs) for the studied association were 4.7 in 2007, 9.4 in 2008, and 4.3 in 2009. Conclusions: The Carajás region presented a medium pattern of endemicity during most of the study period, with a high DR found in 2008.
Abstract:
This paper develops a collision prediction model for three-leg junctions located on national roads (NR) in Northern Portugal. The focus is to identify factors that contribute to collision-type crashes at those locations, mainly factors related to road geometric consistency, since the literature on these is scarce, and to investigate the impact of three modeling methods on those factors: generalized estimating equations, random-effects negative binomial models, and random-parameters negative binomial models. The database used included data published between 2008 and 2010 for 177 three-leg junctions. It was split into three groups of contributing factors, which were tested sequentially for each of the adopted models: first, only traffic; then, traffic and the geometric characteristics of the junctions within their area of influence; and, lastly, factors capturing the difference between the geometric characteristics of the segments bordering the junctions' area of influence and the segment included in that area. The choice of the best modeling technique was supported by a cross-validation carried out to ascertain the best model for the three sets of contributing factors. The models fitted with random-parameters negative binomial models performed best in this process. In the best models obtained for every modeling technique, the characteristics of the road environment, including proxy measures for geometric consistency, along with traffic volume, contribute significantly to the number of collisions. Both the variables concerning the junctions and the national road segments in their area of influence, and the variations of those characteristics relative to the roadway segments bordering that area of influence, proved relevant; there is therefore a clear need to incorporate the effect of geometric consistency in safety studies of three-leg junctions.
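As a pointer to how one of the named techniques is fitted in practice, here is a minimal sketch of a negative binomial GLM for junction crash counts using statsmodels; the data are simulated and the predictors (traffic volume and a geometric-consistency proxy) are stand-ins, not the paper's actual variables or estimates:

```python
# A minimal sketch of a negative binomial crash-frequency model
# (toy simulated data for 177 junctions; illustrative coefficients only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 177                                   # number of junctions in the study
log_aadt = rng.normal(8.0, 0.7, n)        # log of traffic volume (toy values)
consistency = rng.normal(0.0, 1.0, n)     # proxy for geometric consistency
mu = np.exp(-6.0 + 0.8 * log_aadt + 0.3 * consistency)
crashes = rng.poisson(mu)                 # simulated collision counts

X = sm.add_constant(np.column_stack([log_aadt, consistency]))
model = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5))
print(model.fit().summary())              # recovers positive traffic effect
```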