838 results for Multiple methods framework
Abstract:
This keynote presentation will report on some of our research work and experience in the development and application of relevant methods, models, systems and simulation techniques in support of different types and various levels of decision making for business, management and engineering. In particular, the following topics will be covered:
- Modelling, multi-agent-based simulation and analysis of the allocation management of carbon dioxide emission permits in China (Nanfeng Liu & Shuliang Li)
- Agent-based simulation of the dynamic evolution of enterprise carbon assets (Yin Zeng & Shuliang Li)
- A framework & system for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps: a big data perspective (Jin Xu, Zheng Li, Shuliang Li & Yanyan Zhang)
- Open innovation: intelligent model, social media & complex adaptive system simulation (Shuliang Li & Jim Zheng Li)
- A framework, model and software prototype for modelling and simulation for deshopping behaviour and how companies respond (Shawkat Rahman & Shuliang Li)
- Integrating multiple agents, simulation, knowledge bases and fuzzy logic for international marketing decision making (Shuliang Li & Jim Zheng Li)
- A Web-based hybrid intelligent system for combined conventional, digital, mobile, social media and mobile marketing strategy formulation (Shuliang Li & Jim Zheng Li)
- A hybrid intelligent model for Web & social media dynamics, and evolutionary and adaptive branding (Shuliang Li)
- A hybrid paradigm for modelling, simulation and analysis of brand virality in social media (Shuliang Li & Jim Zheng Li)
- Network configuration management: attack paradigms and architectures for computer network survivability (Tero Karvinen & Shuliang Li)
Abstract:
Variability management is one of the major challenges in software product line adoption, since variability needs to be efficiently managed at the various levels of the software product line development process (e.g., requirement analysis, design, implementation). One of the main challenges within variability management is the handling and effective visualization of large-scale (industry-size) models, which in many projects can reach the order of thousands of variability points, along with the dependency relationships that exist among them. These have raised many concerns regarding the scalability of current variability management tools and techniques and their lack of industrial adoption. To address the scalability issues, this work employed a combination of quantitative and qualitative research methods to identify the reasons behind the limited scalability of existing variability management tools and techniques. In addition to producing a comprehensive catalogue of existing tools, the outcome from this stage helped identify the major limitations of existing tools. Based on the findings, a novel approach was created for managing variability that employed two main principles for supporting scalability. First, the separation-of-concerns principle was employed by creating multiple views of variability models to alleviate information overload. Second, hyperbolic trees were used to visualise models (compared to the Euclidean-space trees traditionally used); a layout sketch is given below. The result was an approach that can represent models encompassing hundreds of variability points and complex relationships. These concepts were demonstrated by implementing them in an existing variability management tool and using it to model a real-life product line with over a thousand variability points. Finally, in order to assess the work, an evaluation framework was designed based on various established usability assessment best practices and standards. The framework was then used with several case studies to benchmark the performance of this work against other existing tools.
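The hyperbolic-tree idea is easy to sketch. Below is a minimal, hypothetical Python layout routine (the dict-based tree, the step size of 0.6 and the proportional wedge splitting are illustrative assumptions, not the tool's actual algorithm): each level moves a fixed hyperbolic distance outward, and tanh maps that distance into the Poincaré disk, so deep levels compress toward the rim while detail stays legible near the focus. This is what lets a hyperbolic view hold thousands of variability points where a Euclidean layout runs out of space.

```python
import math

def subtree_size(tree, node):
    """Count the nodes in the subtree rooted at `node`."""
    return 1 + sum(subtree_size(tree, c) for c in tree.get(node, []))

def hyperbolic_layout(tree, node, depth=0, wedge=(0.0, 2 * math.pi), pos=None):
    """Assign each node a point in the Poincare disk (unit circle).

    Children split the parent's angular wedge in proportion to their
    subtree sizes; tanh compresses depth into the unit radius.
    """
    if pos is None:
        pos = {}
    lo, hi = wedge
    theta = 0.5 * (lo + hi)
    r = math.tanh(0.6 * depth)  # hyperbolic distance -> disk radius, always < 1
    pos[node] = (r * math.cos(theta), r * math.sin(theta))
    children = tree.get(node, [])
    total = sum(subtree_size(tree, c) for c in children) or 1
    start = lo
    for c in children:
        span = (hi - lo) * subtree_size(tree, c) / total
        hyperbolic_layout(tree, c, depth + 1, (start, start + span), pos)
        start += span
    return pos

# Hypothetical feature model fragment with nested variability points.
tree = {"root": ["ui", "db"], "ui": ["theme", "lang"], "db": ["sql", "nosql"]}
print(hyperbolic_layout(tree, "root"))
```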
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
It is nowadays recognized that the risk of human co-exposure to multiple mycotoxins is real. In recent years, a number of studies have approached the issue of co-exposure and the best way to develop a more precise and realistic assessment. Likewise, the growing concern about the combined effects of mycotoxins and their potential impact on human health has been reflected in the increasing number of toxicological studies on the combined toxicity of these compounds. Nevertheless, risk assessment of these toxins still follows the conventional paradigm of single exposure and single effects, incorporating only the possibility of additivity but not taking into account the complex dynamics associated with interactions between different mycotoxins or between mycotoxins and other food contaminants. Considering that risk assessment is intimately related to the establishment of regulatory guidelines, once the risk assessment is completed, an effort to reduce or manage the risk should follow in order to protect public health. Risk assessment of combined human exposure to multiple mycotoxins thus poses several challenges to scientists, risk assessors and risk managers, and opens new avenues for research. This presentation aims to give an overview of the different challenges posed by the likelihood of human co-exposure to mycotoxins and the possibility of interactive effects occurring after absorption, towards knowledge generation to support more accurate human risk assessment and risk management. For this purpose, a physiologically-based framework that includes knowledge on the bioaccessibility, toxicokinetics and toxicodynamics of multiple toxins is proposed. Regarding exposure assessment, the need for harmonized food consumption data, the availability of multi-analyte methods for mycotoxin quantification, the management of left-censored data and the use of probabilistic models will be highlighted, in order to develop a more precise and realistic exposure assessment. On the other hand, the application of predictive mathematical models to estimate mycotoxins' combined effects from in vitro toxicity studies will also be discussed (the two reference models are sketched below). Results from a recent Portuguese project aimed at exploring the toxic effects of mixtures of mycotoxins in infant foods and their potential health impact will be presented as a case study, illustrating the different aspects of risk assessment highlighted in this presentation. Further studies on hazard and exposure assessment of multiple mycotoxins, using harmonized approaches and methodologies, will be crucial towards an improvement in data quality, contributing to holistic risk assessment and risk management strategies for multiple mycotoxins in foodstuffs.
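For context (standard mixture-toxicology theory, not taken from the presentation itself), the two reference models used to predict joint effects from single-compound in vitro data are concentration addition (CA) and independent action (IA):

\[
\text{CA:}\quad \sum_{i=1}^{n} \frac{c_i}{EC_{x,i}} = 1,
\qquad
\text{IA:}\quad E(c_{\mathrm{mix}}) = 1 - \prod_{i=1}^{n} \bigl(1 - E(c_i)\bigr),
\]

where \(c_i\) is the concentration of mycotoxin \(i\) in the mixture, \(EC_{x,i}\) is the concentration of \(i\) alone that produces effect level \(x\), and \(E(\cdot)\) is the fractional effect. Observed mixture toxicity departing from both predictions is what flags synergism or antagonism.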
Abstract:
Reliability and dependability modeling can be employed during many stages of the analysis of a computing system to gain insights into its critical behaviors. To provide useful results, realistic models of systems are often necessarily large and complex. Numerical analysis of these models presents a formidable challenge because the sizes of their state-space descriptions grow exponentially with the sizes of the models. On the other hand, simulation of the models requires analysis of many trajectories in order to compute statistically correct solutions. This dissertation presents a novel framework for performing both numerical analysis and simulation. The new numerical approach computes bounds on the solutions of transient measures in large continuous-time Markov chains (CTMCs). It extends existing path-based and uniformization-based methods (plain uniformization is sketched below) by identifying sets of paths that are equivalent with respect to a reward measure and related to one another via a simple structural relationship. This relationship makes it possible for the approach to explore multiple paths at the same time, thus significantly increasing the number of paths that can be explored in a given amount of time. Furthermore, the use of a structured representation for the state space and the direct computation of the desired reward measure (without ever storing the solution vector) allow it to analyze very large models using a very small amount of storage. Often, path-based techniques must compute many paths to obtain tight bounds. In addition to presenting the basic path-based approach, we also present algorithms for computing more paths and tighter bounds quickly. One resulting approach is based on the concept of path composition, whereby precomputed subpaths are composed to compute whole paths efficiently. Another approach is based on selecting important paths (among a set of many paths) for evaluation. Many path-based techniques suffer from having to evaluate many (unimportant) paths. Evaluating the important ones helps to compute tight bounds efficiently and quickly.
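The dissertation's path-based bounding technique is not reproduced here, but plain uniformization, the baseline it extends, fits in a few lines. A minimal sketch (the generator, reward vector and safety margin are illustrative assumptions): the CTMC with generator Q is converted into a DTMC P = I + Q/lam, and the transient distribution at time t is a Poisson-weighted sum of DTMC step distributions, truncated once the remaining Poisson mass is negligible.

```python
import numpy as np

def transient_reward(Q, pi0, reward, t, tol=1e-10):
    """Expected instantaneous reward at time t of a CTMC, via uniformization."""
    lam = max(-np.diag(Q)) * 1.02        # uniformization rate, with safety margin
    P = np.eye(Q.shape[0]) + Q / lam     # embedded DTMC
    v = pi0.copy()
    weight = np.exp(-lam * t)            # Poisson(lam*t) pmf at k = 0
    acc = weight * v
    k, mass = 0, weight
    while 1.0 - mass > tol:              # stop when Poisson tail mass < tol
        k += 1
        v = v @ P                        # distribution after one more DTMC step
        weight *= lam * t / k            # recursive Poisson pmf update
        acc += weight * v
        mass += weight
    return float(acc @ reward)

# Hypothetical 3-state availability model: up -> degraded -> down, with repairs.
Q = np.array([[-0.2, 0.2, 0.0],
              [0.5, -0.9, 0.4],
              [1.0, 0.0, -1.0]])
pi0 = np.array([1.0, 0.0, 0.0])
reward = np.array([1.0, 0.5, 0.0])       # e.g. delivered performance per state
print(transient_reward(Q, pi0, reward, t=2.0))
```

The exponential state-space growth the abstract mentions is exactly why this direct approach breaks down: P is dense in the number of states, which motivates the structured representations and path-based bounds developed in the dissertation.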
Abstract:
Conservation of the seven lagoons of the Palavas complex (southern France) has been severely impaired by nutrient over-enrichment for at least four decades. The effluents of the Montpellier wastewater treatment plant (WWTP) represented the main nutrient input. To improve the water quality of these lagoons, this WWTP was renovated and upgraded and, since the end of 2005, its effluents have been discharged 11 km offshore into the Mediterranean (total investment €150 M). Possibilities of ecosystem restoration as part of a conservation programme were explored by a focus group of experts. Their tasks were: (i) to evaluate the impact of the reduction of the nutrient input; (ii) if necessary, to design additional measures for an active restoration programme; and (iii) to predict ecosystem trajectories for the different cases. Extension of Magnoliophyta meadows can be taken as a proxy for ecosystem restoration, as they favour the increase of several fish (seahorse) and bird (ducks, swans, herons) species, although they represent a trade-off for greater flamingos. Additional measures for active ecosystem restoration were only recommended for the most impaired lagoon, Méjean, while the least impaired lagoon, Ingril, is already on a trajectory of spontaneous recovery. A multiple contingent valuation considering four different management options for the Méjean lagoon was used in a pilot study based on face-to-face interviews with 159 respondents. Three levels of ecosystem restoration were expressed in terms of recovery of Magnoliophyta meadows, including their impact on emblematic fish and avifauna. These were combined with different options for access (status quo, increased access, increased access with measures to reduce disturbance). The results show a willingness of local populations to pay about €25 per year for the highest level of ecological restoration, while they were only willing to allocate about €5 for additional footpaths and hides.
Abstract:
Part 5: Service Orientation in Collaborative Networks
Abstract:
Increases in the resolution of numerical weather prediction models have allowed more and more realistic forecasts of atmospheric parameters. Due to the growing variability of the predicted fields, traditional verification methods are not always able to describe model skill, because they are based on a grid-point-by-grid-point matching between observation and prediction. Recently, new spatial verification methods have been developed with the aim of showing the benefit associated with high-resolution forecasts. Within the framework of the MesoVICT international project, the initial aim of this work is to compare the new techniques, remarking on their advantages and disadvantages. First of all, the MesoVICT basic examples, represented by synthetic precipitation fields, have been examined. Providing an error evaluation in terms of structure, amplitude and location of the precipitation fields, the SAL method has been studied more thoroughly than the other approaches, with its implementation in the core cases of the project (its amplitude and location components are sketched below). The verification procedure has concerned precipitation fields over central Europe: comparisons between the forecasts performed by the 00z COSMO-2 model and the VERA (Vienna Enhanced Resolution Analysis) have been carried out. The study of these cases has shown some weaknesses of the methodology examined; in particular, a correlation between the optimal domain size and the extent of the precipitation systems has been highlighted. In order to increase the ability of SAL, a subdivision of the original domain into three subdomains has been made and the method has been applied again. Some limits have been found in cases in which at least one of the two domains does not show precipitation. The overall results for the subdomains have been summarized in scatter plots. With the aim of identifying systematic errors of the model, the variability of the three parameters has been studied for each subdomain.
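As context, the amplitude and location components of SAL reduce to a few lines, while the structure component requires identifying precipitation objects first and is omitted here. The sketch below uses hypothetical fields and grid-index coordinates:

```python
import numpy as np

def sal_amplitude_location(forecast, observed):
    """Amplitude (A) and first location component (L1) of the SAL score.

    A  : normalized difference of domain-mean precipitation, in [-2, 2].
    L1 : distance between the centres of mass of the two fields,
         normalized by the largest distance within the domain.
    """
    f_mean, o_mean = forecast.mean(), observed.mean()
    A = (f_mean - o_mean) / (0.5 * (f_mean + o_mean))

    def centre_of_mass(field):
        ny, nx = field.shape
        ys, xs = np.mgrid[0:ny, 0:nx]
        total = field.sum()
        return np.array([(ys * field).sum() / total,
                         (xs * field).sum() / total])

    d_max = np.hypot(*forecast.shape)    # domain diagonal
    L1 = np.linalg.norm(centre_of_mass(forecast)
                        - centre_of_mass(observed)) / d_max
    return A, L1

# Hypothetical fields: same total rain, displaced by a few grid points,
# so the amplitude error is ~0 while the location error is > 0.
obs = np.zeros((100, 100)); obs[40:50, 40:50] = 1.0
fcs = np.zeros((100, 100)); fcs[45:55, 50:60] = 1.0
print(sal_amplitude_location(fcs, obs))
```

The correlation the work found between optimal domain size and the extent of the precipitation systems shows up directly here: both A and L1 are normalized by domain-wide quantities, so shrinking or growing the domain changes the scores even for a fixed forecast error.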
Abstract:
Process systems design, operation and synthesis problems under uncertainty can readily be formulated as two-stage stochastic mixed-integer linear and nonlinear (nonconvex) programming (MILP and MINLP) problems. These problems, with a scenario-based formulation, lead to large-scale MILPs/MINLPs that are well structured (the generic two-stage form and the resulting Benders cut are sketched below). The first part of the thesis proposes a new finitely convergent cross decomposition method (CD), where Benders decomposition (BD) and Dantzig-Wolfe decomposition (DWD) are combined in a unified framework to improve the solution of scenario-based two-stage stochastic MILPs. This method alternates between DWD iterations and BD iterations, where DWD restricted master problems and BD primal problems yield a sequence of upper bounds, and BD relaxed master problems yield a sequence of lower bounds. A variant of CD that includes multiple columns per iteration of the DWD restricted master problem and multiple cuts per iteration of the BD relaxed master problem, called multicolumn-multicut CD, is then developed to improve solution time. Finally, an extended cross decomposition method (ECD) for solving two-stage stochastic programs with risk constraints is proposed. In this approach, a CD approach at the first level and DWD at a second level are used to solve the original problem to optimality. ECD has a computational advantage over a bilevel decomposition strategy or solving the monolithic problem using an MILP solver. The second part of the thesis develops a joint decomposition approach combining Lagrangian decomposition (LD) and generalized Benders decomposition (GBD) to efficiently solve stochastic mixed-integer nonlinear nonconvex programming problems to global optimality, without the need for explicit branch-and-bound search. In this approach, LD subproblems and GBD subproblems are systematically solved in a single framework. The relaxed master problem, obtained from the reformulation of the original problem, is solved only when necessary. A convexification of the relaxed master problem and a domain reduction procedure are integrated into the decomposition framework to improve solution efficiency. Using case studies taken from renewable resource and fossil-fuel based applications in process systems engineering, it can be seen that these novel decomposition approaches have significant benefits over classical decomposition methods and state-of-the-art MILP/MINLP global optimization solvers.
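For readers outside stochastic programming, the structure being decomposed is the generic two-stage scenario formulation (the notation here is standard, not the thesis's own):

\[
\min_{x \in X}\; c^{\top}x + \sum_{s \in S} p_s\, Q_s(x),
\qquad
Q_s(x) = \min_{y_s \ge 0}\;\bigl\{\, d_s^{\top} y_s \;:\; W_s y_s \ge h_s - T_s x \,\bigr\},
\]

where \(x\) are the first-stage (here integer) decisions, \(y_s\) the recourse decisions and \(p_s\) the probability of scenario \(s\). BD primal problems evaluate \(Q_s(\hat{x})\) for a fixed \(\hat{x}\) (giving an upper bound when feasible), and each dual solution \(\pi_s\) yields the optimality cut

\[
\theta_s \ge \pi_s^{\top}\,(h_s - T_s x)
\]

for the relaxed master problem, whose optimum is a lower bound. Cross decomposition interleaves these BD steps with DWD iterations so that the two bounding sequences tighten each other.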
Abstract:
Dynamical models of stellar systems represent a powerful tool to study their internal structure and dynamics, to interpret the observed morphological and kinematical fields, and also to support numerical simulations of their evolution. We present a method especially designed to build axisymmetric Jeans models of galaxies, assumed to be stationary and collisionless stellar systems (the underlying Jeans equations are recalled below). The aim is the development of a rigorous and flexible modelling procedure for multicomponent galaxies, composed of different stellar and dark matter distributions and a central supermassive black hole. The stellar components, in particular, are intended to represent different galaxy structures, such as discs, bulges and halos, and can therefore have different structural (density profile, flattening, mass, scale-length), dynamical (rotation, velocity dispersion anisotropy) and population (age, metallicity, initial mass function, mass-to-light ratio) properties. The theoretical framework supporting the modelling procedure is presented, with the introduction of a suitable nomenclature, and its numerical implementation is discussed, with particular reference to the numerical code JASMINE2, developed for this purpose. We propose an approach for efficiently scaling the contributions in mass, luminosity and rotational support of the different matter components, allowing for fast and flexible explorations of the model parameter space. We also offer different methods for the computation of the gravitational potentials associated with the density components, chosen for their easier numerical tractability. A few galaxy models are studied, showing the internal and projected structural and dynamical properties of multicomponent galaxies, with a focus on axisymmetric early-type galaxies with complex kinematical morphologies. The application of galaxy models to the study of initial conditions for hydrodynamical and $N$-body simulations of galaxy evolution is also addressed, allowing us in particular to investigate the large number of interesting combinations of the parameters that determine the structure and dynamics of complex multicomponent stellar systems.
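For reference (standard collisionless dynamics, not specific to JASMINE2), the stationary axisymmetric Jeans equations solved by such models, under the common assumption that the velocity-dispersion tensor is aligned with the cylindrical coordinates \((R, \varphi, z)\), are

\[
\frac{\partial \bigl(\rho\, \overline{v_z^2}\bigr)}{\partial z} = -\rho\, \frac{\partial \Phi}{\partial z},
\qquad
\frac{\partial \bigl(\rho\, \overline{v_R^2}\bigr)}{\partial R} + \rho\, \frac{\overline{v_R^2} - \overline{v_\varphi^2}}{R} = -\rho\, \frac{\partial \Phi}{\partial R},
\]

where \(\rho\) is the density of a given component and \(\Phi\) the total gravitational potential (all stellar and dark components plus the central black hole). In a multicomponent model, one such pair holds for each stellar component in the same total \(\Phi\), which is what makes the independent scaling of the individual mass contributions described above possible.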
Abstract:
This dissertation aims at developing advanced analytical tools able to model surface waves propagating in elastic metasurfaces. In particular, four different objectives are defined and pursued throughout this work to enrich the description of metasurface dynamics. First, a theoretical framework is developed to describe the dispersion properties of a seismic metasurface composed of discrete resonators placed on a porous medium, part of which is considered fully saturated. Such a model combines classical elasticity theory, Biot's poroelasticity and an effective medium approach to describe the metasurface dynamics and its coupling with the poroelastic substrate. Second, an exact formulation based on multiple scattering theory is developed to extend the two-dimensional classical Lamb's problem to the case of an elastic half-space coupled to an arbitrary number of discrete surface resonators (the coupled system is sketched below). To this purpose, the incident wavefield generated by a harmonic source and the scattered field generated by each resonator are calculated. The substrate wavefield is then obtained as the solution of the coupled problem, arising from the interference of the incident field and the multiple scattered fields of the oscillators. Third, the formulation discussed above is extended to three-dimensional contexts. The purpose here is to investigate the dynamic behavior and the topological properties of quasiperiodic elastic metasurfaces. Finally, the multiple scattering formulation is extended to model flexural metasurfaces, i.e., arrays of thin plates. To this end, the resonant plates are modeled by means of their equivalent impedance, derived by exploiting Kirchhoff plate theory. The proposed formulation permits the treatment of a general flexural metasurface, with no limitation on the number of plates or the configurations taken into account. Overall, the proposed analytical tools could pave the way for a better understanding of metasurface dynamics and their implementation in engineered devices.
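Schematically (in scalar form and generic notation; the actual elastodynamic problem is tensorial), a multiple-scattering formulation of this kind couples \(N\) surface resonators through the substrate Green's function \(G\):

\[
u(\mathbf{x}) = u_0(\mathbf{x}) + \sum_{j=1}^{N} G(\mathbf{x}, \mathbf{x}_j)\, f_j,
\qquad
f_j = Z_j(\omega) \Bigl[ u_0(\mathbf{x}_j) + \sum_{k \neq j} G(\mathbf{x}_j, \mathbf{x}_k)\, f_k \Bigr],
\]

where \(u_0\) is the incident wavefield of the harmonic source, \(f_j\) the force exchanged with resonator \(j\), and \(Z_j(\omega)\) its impedance (for the flexural case, the equivalent impedance of a Kirchhoff plate). The second relation is a dense \(N \times N\) linear system in the \(f_j\); solving it and substituting into the first gives the total wavefield with all interference effects retained, with no restriction on the number or layout of the resonators.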
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
The interest in using titanium to fabricate removable partial denture (RPD) frameworks has increased, but there are few studies evaluating the effects of casting methods on clasp behavior. OBJECTIVE: This study compared the occurrence of porosities and the retentive force of commercially pure titanium (CP Ti) and cobalt-chromium (Co-Cr) removable partial denture circumferential clasps cast by induction/centrifugation and plasma/vacuum-pressure. MATERIAL AND METHODS: 72 frameworks were cast from CP Ti (n=36) and Co-Cr alloy (n=36; control group). For each material, 18 frameworks were cast by electromagnetic induction and injected by centrifugation, whereas the other 18 were cast by plasma and injected by vacuum-pressure. For each casting method, three subgroups (n=6) were formed: 0.25 mm, 0.50 mm, and 0.75 mm undercuts. The specimens were radiographed and subjected to an insertion/removal test simulating 5 years of framework use. Data were analyzed by ANOVA and Tukey's test to compare materials and casting methods (α=0.05); a sketch of this analysis is given below. RESULTS: Three of 18 specimens in the induction/centrifugation group and 9 of 18 specimens in the plasma/vacuum-pressure group presented porosities, but only 1 and 7 specimens, respectively, were rejected for the simulation test. For the Co-Cr alloy, no defects were found. Comparing the casting methods, statistically significant differences (p<0.05) were observed only for the Co-Cr alloy with 0.25 mm and 0.50 mm undercuts. Significant differences were found for the 0.25 mm and 0.75 mm undercuts depending on the material used. For the 0.50 mm undercut, significant differences were found when the materials were induction cast. CONCLUSION: Although both casting methods produced satisfactory CP Ti RPD frameworks, the occurrence of porosities was greater with the plasma/vacuum-pressure method than with the induction/centrifugation method, the latter resulting in higher clasp rigidity and higher retention force values.
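The statistical comparison described (ANOVA followed by Tukey's test at α = 0.05) can be reproduced on data of this shape with standard tools. The sketch below simulates retention forces, since the study's raw measurements are not given; only the factor structure (2 materials x 2 casting methods x 3 undercuts, n = 6) mirrors the design.

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)

# Simulated retention forces (N): larger undercuts retain more, with
# hypothetical material/method offsets standing in for the real data.
rows = []
for material in ("CP Ti", "Co-Cr"):
    for method in ("induction", "plasma"):
        for undercut in (0.25, 0.50, 0.75):
            base = 5.0 + 8.0 * undercut + (1.5 if material == "Co-Cr" else 0.0)
            rows += [(material, method, undercut, base + rng.normal(0, 0.5))
                     for _ in range(6)]
df = pd.DataFrame(rows, columns=["material", "method", "undercut", "retention"])

# Two-way ANOVA (material x method) per undercut level, then Tukey's HSD
# on the material/method groups, analogous to the abstract's comparison.
for undercut, g in df.groupby("undercut"):
    model = ols("retention ~ C(material) * C(method)", data=g).fit()
    print(f"undercut = {undercut} mm")
    print(anova_lm(model, typ=2))
    print(pairwise_tukeyhsd(g["retention"], g["material"] + "/" + g["method"]))
```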
Abstract:
Shallow-water tropical reefs and the deep sea represent the two most diverse marine environments. Understanding the origin and diversification of this biodiversity is a major quest in ecology and evolution. The most prominent and well-supported explanation, articulated since the first explorations of the deep sea, holds that benthic marine fauna originated in shallow, onshore environments and diversified into deeper waters. In contrast, evidence that groups of marine organisms originated in the deep sea is limited, and the possibility that deep-water taxa have contributed to the formation of shallow-water communities remains untested with phylogenetic methods. Here we show that stylasterid corals (Cnidaria: Hydrozoa: Stylasteridae), the second most diverse group of hard corals, originated and diversified extensively in the deep sea, and subsequently invaded shallow waters. Our phylogenetic results show that deep-water stylasterid corals have invaded the shallow-water tropics three times, with one additional invasion of the shallow-water temperate zone. Our results also show that anti-predatory innovations arose in the deep sea, but were not involved in the shallow-water invasions. These findings are the first robust evidence that an important group of tropical shallow-water marine animals evolved from deep-water ancestors.
Abstract:
Purpose: To evaluate the expression of NF-kappa B pathway genes in total bone marrow samples obtained from MM patients at diagnosis using real-time quantitative PCR, and to evaluate its possible correlation with disease clinical features and survival. Material and methods: The expression of eight genes related to the NF-kappa B pathway (NFKB1, IKB, RANK, RANKL, OPG, IL6, VCAM1 and ICAM1) was studied in 53 bone marrow samples from newly diagnosed MM patients and in seven normal controls, using the Taqman system (the relative-quantification arithmetic is sketched below). Genes were considered overexpressed when the tumor expression level was at least four times higher than that observed in normal samples. Results: The percentages of overexpression of the eight genes were: NFKB1 0%, IKB 22.6%, RANK 15.1%, RANKL 31.3%, OPG 7.5%, IL6 39.6%, VCAM1 10% and ICAM1 26%. We found an association between IL6 expression level and International Staging System (ISS) score (p = 0.01), meaning that MM patients with high ISS scores are more likely to overexpress IL6. The mean value of ICAM1 relative expression was also associated with the ISS score (p = 0.02). Regarding overall survival (OS), cases with IL6 overexpression presented worse outcomes than cases with normal IL6 expression (p = 0.04). Conclusion: We demonstrated that total bone marrow aspirates can be used as a source of material for gene expression studies in MM. In this context, we confirmed that IL6 overexpression was significantly associated with worse survival, and we showed that it is associated with high ISS scores. Also, ICAM1 was overexpressed in 26% of cases and its expression level was associated with ISS scores.
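As context, the standard arithmetic behind TaqMan relative quantification is the Livak \(2^{-\Delta\Delta C_t}\) method; the abstract does not state the exact pipeline used, so the sketch below, with hypothetical Ct values, only illustrates how a fourfold-overexpression call of this kind is computed.

```python
def relative_expression(ct_target_tumor, ct_ref_tumor,
                        ct_target_normal, ct_ref_normal):
    """Livak 2^-ddCt fold change of a target gene, tumor vs. normal.

    Ct values are PCR cycle thresholds; the target gene is normalized
    against a reference (housekeeping) gene in each sample type.
    """
    d_ct_tumor = ct_target_tumor - ct_ref_tumor
    d_ct_normal = ct_target_normal - ct_ref_normal
    return 2.0 ** -(d_ct_tumor - d_ct_normal)

# Hypothetical Ct values for IL6 against a housekeeping gene.
fold = relative_expression(24.1, 18.0, 27.5, 18.2)
print(f"fold change = {fold:.1f}, overexpressed = {fold >= 4.0}")
```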