991 results for Modeling levels
Abstract:
This paper presents mathematical programming models for setting the number of services on a specified system of bus lines intended to serve the high demand levels that may arise from the disruption of Rapid Transit services or during massive events. These models determine two basic magnitudes: a) the number of bus units assigned to each line and b) the number of services assigned to those units. Passenger flow assignment to lines can be treated as system-optimal, in the sense that units and services are assigned by minimizing a linear combination of operating costs and total user travel time. The models account for the delays buses experience from passenger boarding and alighting and from queueing at stations, as well as the delays passengers experience waiting at stations. For a congested, strategy-based, user-optimal passenger assignment model with strict capacities on the bus lines, the use of the method of successive averages is shown.
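The closing sentence mentions the method of successive averages (MSA). As a hedged illustration of the general scheme only (the two-line toy network, the congestion function, and all parameter values below are hypothetical, not the paper's model):

```python
# Method of successive averages (MSA) sketch: split a fixed passenger
# demand between two bus lines whose travel times grow with load.
# The BPR-style congestion function and the line parameters are
# illustrative assumptions, not taken from the paper.

def travel_time(flow, free_time, capacity):
    """Congested in-vehicle time (hypothetical BPR-style function)."""
    return free_time * (1.0 + 0.15 * (flow / capacity) ** 4)

def msa_assignment(demand=100.0, iterations=200):
    lines = [
        {"free_time": 10.0, "capacity": 60.0},
        {"free_time": 12.0, "capacity": 80.0},
    ]
    flows = [demand / 2.0, demand / 2.0]      # initial even split
    for k in range(1, iterations + 1):
        times = [travel_time(f, line["free_time"], line["capacity"])
                 for f, line in zip(flows, lines)]
        best = times.index(min(times))        # all-or-nothing auxiliary flows
        aux = [demand if i == best else 0.0 for i in range(len(lines))]
        step = 1.0 / k                        # decreasing MSA step size
        flows = [f + step * (a - f) for f, a in zip(flows, aux)]
    return flows

flows = msa_assignment()
```

With these toy numbers the iterates oscillate around an equilibrium split of roughly 65/35 between the two lines, with the averaging step damping the oscillation.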
Abstract:
In recent decades, neuropsychological theories have tended to consider cognitive functions as the result of the brain working as a whole rather than of individual local areas of its cortex. Studies based on neuroimaging techniques have multiplied in recent years, driving exponential growth of the body of knowledge about the relations between cognitive functions and brain structures [1]. However, such rapid evolution makes it difficult to integrate these findings into verifiable theories and, even more so, to translate them into cognitive rehabilitation. The aim of this research work is to develop a cognitive process-modeling tool. The purpose of this system is, first, to represent multidimensional data from structural and functional connectivity, neuroimaging, lesion studies, and clinical interventions [2][3]. This will allow the identification of consolidated knowledge, hypotheses, experimental designs, new data from ongoing studies, and emerging results from clinical interventions. Second, we pursue the use of Artificial Intelligence to assist decision making, allowing progress towards evidence-based and personalized treatments in cognitive rehabilitation. This work presents the knowledge base design of the knowledge representation tool. It comprises two different taxonomies (structure and function) and a set of tags linking both taxonomies at different levels of structural and functional organization. The remainder of the abstract is organized as follows: Section 2 presents the web application used for gathering the information needed to generate the knowledge base, Section 3 describes the knowledge base structure, and Section 4 presents the conclusions reached.
Abstract:
A great challenge for future information technologies is building reliable systems on top of unreliable components. The parameters of modern and future technology devices are affected by severe levels of process variability, and devices will degrade and even fail during the normal lifetime of the chip due to aging mechanisms. These extreme levels of variability are caused by the high degree of device miniaturization and the random placement of individual atoms. Variability is considered a "red brick" by the International Technology Roadmap for Semiconductors. The session is devoted to this topic, presenting research experiences from the Spanish Network on Variability, VARIABLES. In this session a talk entitled "Modeling sub-threshold slope and DIBL mismatch of sub-22nm FinFET" was presented.
Abstract:
Knowledge modeling tools are software tools that follow a modeling approach to help developers build a knowledge-based system. The purpose of this article is to show the advantages of using this type of tool in the development of complex knowledge-based decision support systems. To do so, the article describes the development of a system called SAIDA in the domain of hydrology with the help of the KSM modeling tool. SAIDA operates on real-time data recorded by sensors (rainfall, water levels, flows, etc.). It follows a multi-agent architecture to interpret the data, predict future behavior, and recommend control actions. The system includes an advanced knowledge-based architecture with multiple symbolic representations. KSM was especially useful for designing and implementing this complex knowledge-based architecture in an efficient way.
Abstract:
The efficiency of a power plant is affected by the distribution of the pulverized coal within the furnace. The coal, which is pulverized in the mills, is transported and distributed by the primary gas through the mill-ducts to the interior of the furnace. This serves a double function: to dry the coal and to feed it in at different levels so as to optimize combustion, in the sense that complete combustion occurs with homogeneous heat fluxes to the walls. The mill-duct systems of a real power plant are very complex and not yet well understood. In particular, experimental data on the mass flows of coal to the different levels are very difficult to measure. CFD modeling can help to determine them. An Eulerian/Lagrangian approach is used because of the low solid-gas volume ratio.
Abstract:
Background: In recent years, Spain has implemented a number of air quality control measures that are expected to lead to a future reduction in fine particle concentrations and an ensuing positive impact on public health. Objectives: We aimed to assess the impact on mortality attributable to a reduction in fine particle levels in Spain in 2014 relative to the estimated level for 2007. Methods: To estimate exposure, we constructed fine particle distribution models for Spain for 2007 (reference scenario) and 2014 (projected scenario) with a spatial resolution of 16x16 km. In a second step, we used the concentration-response functions proposed by cohort studies carried out in Europe (European Study of Cohorts for Air Pollution Effects and the Rome longitudinal cohort) and North America (American Cancer Society cohort, Harvard Six Cities study and Canadian national cohort) to calculate the number of attributable annual deaths corresponding to all causes, all non-accidental causes, ischemic heart disease and lung cancer among persons aged over 25 years (2005-2007 mortality rate data). We examined the effect of the Spanish demographic shift on our analysis using 2007 and 2012 population figures. Results: Our model suggested a mean overall reduction in fine particle levels of 1 µg/m3 by 2014. Taking into account 2007 population data, between 8 and 15 all-cause deaths per 100,000 population could be postponed annually by the expected reduction in fine particle levels. For specific subgroups, estimates varied from 10 to 30 deaths for all non-accidental causes, from 1 to 5 for lung cancer, and from 2 to 6 for ischemic heart disease. The expected burden of preventable mortality would be even higher in the future due to Spanish population growth. Taking into account the population older than 30 years in 2012, the absolute mortality impact estimate would increase by approximately 18%.
Conclusions: Effective implementation of air quality measures in Spain, in a scenario with a short-term projection, would produce an appreciable decline in fine particle concentrations, and this, in turn, would lead to notable health-related benefits. Recent European cohort studies strengthen the evidence of an association between long-term exposure to fine particles and health effects, and could enhance health impact quantification in Europe. Air quality models can contribute to improved assessment of air pollution health impact estimates, particularly in study areas without air pollution monitoring data.
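The attributable-mortality arithmetic behind estimates of this kind is the standard health impact assessment calculation with a log-linear concentration-response function. A hedged sketch, with an illustrative relative risk and baseline rate (the specific numbers below are hypothetical, not the cohort coefficients used in the study):

```python
import math

def attributable_deaths(baseline_deaths, delta_conc, beta):
    """Deaths attributable to a concentration change delta_conc, using a
    log-linear concentration-response function RR = exp(beta * conc):
    attributable fraction AF = 1 - exp(-beta * delta_conc)."""
    af = 1.0 - math.exp(-beta * delta_conc)
    return baseline_deaths * af

# Illustrative (hypothetical) inputs: a relative risk of 1.07 per
# 10 ug/m3 of fine particles, a 1 ug/m3 reduction, and an all-cause
# baseline of 850 deaths per 100,000 population.
beta = math.log(1.07) / 10.0
postponed = attributable_deaths(850.0, 1.0, beta)   # per 100,000 population
```

With these toy inputs the result is of the same order as the 8-15 postponed deaths per 100,000 reported in the abstract; the actual estimates depend on the cohort-specific concentration-response functions.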
3-D modeling of perimeter recombination in GaAs diodes and its influence on concentrator solar cells
Abstract:
This paper describes a complete modeling of the perimeter recombination of GaAs diodes, which resolves most open questions and overcomes the limitations of previous models. Because of its three-dimensional nature, the implemented model is able to simulate real devices. GaAs diodes are manufactured on two epiwafers with different base doping levels and with different sizes and geometries, namely square and circular. The model is validated by fitting the experimental dark I-V curves of the manufactured GaAs diodes. A comprehensive 3-D description of the phenomena governing perimeter recombination is supplied with the help of the model. Finally, the model is applied to concentrator GaAs solar cells to assess the impact of their doping level, size and geometry on perimeter recombination.
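The area-versus-perimeter scaling at the heart of the paper can be sketched with the common two-diode picture, in which perimeter recombination contributes an ideality-factor-2 current component scaling with perimeter length rather than area. The prefactor values below are hypothetical, not fitted to the paper's devices:

```python
import math

def dark_current_terms(v, area, perimeter, j01=1e-19, j02p=1e-13, t=300.0):
    """Two-diode sketch of a dark I-V curve: a bulk term scaling with
    area (n = 1) and a perimeter-recombination term scaling with
    perimeter (n = 2). j01 [A/cm^2] and j02p [A/cm] are hypothetical."""
    vt = 8.617333262e-5 * t                  # thermal voltage k_B*T/q, in V
    bulk = area * j01 * (math.exp(v / vt) - 1.0)
    edge = perimeter * j02p * (math.exp(v / (2.0 * vt)) - 1.0)
    return bulk, edge

# Perimeter/area ratio grows as devices shrink, so the perimeter term
# takes a larger share of the dark current in small diodes.
bulk_b, edge_b = dark_current_terms(0.9, area=0.01, perimeter=0.4)   # 1 mm square
bulk_s, edge_s = dark_current_terms(0.9, area=1e-4, perimeter=0.04)  # 100 um square
share_big = edge_b / (bulk_b + edge_b)
share_small = edge_s / (bulk_s + edge_s)
```

Regardless of the assumed prefactors, the perimeter share is always larger for the smaller diode, which is why diode size and geometry are the experimental levers used in the paper.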
Abstract:
The bryostatins are a unique family of emerging cancer chemotherapeutic candidates isolated from marine bryozoa. Although the biochemical basis for their therapeutic activity is not known, these macrolactones exhibit high affinities for protein kinase C (PKC) isozymes, compete for the phorbol ester binding site on PKC, and stimulate kinase activity in vitro and in vivo. Unlike the phorbol esters, they are not first-stage tumor promoters. The design, computer modeling, NMR solution structure, PKC binding, and functional assays of a unique class of synthetic bryostatin analogs are described. These analogs (7b, 7c, and 8) retain the putative recognition domain of the bryostatins but are simplified through deletions and modifications in the C4-C14 spacer domain. Computer modeling of an analog prototype (7a) indicates that it exists preferentially in two distinct conformational classes, one in close agreement with the crystal structure of bryostatin 1. The solution structure of synthetic analog 7c was determined by NMR spectroscopy and found to be very similar to the previously reported structures of bryostatins 1 and 10. Analogs 7b, 7c, and 8 bound strongly to PKC isozymes with Ki = 297, 3.4, and 8.3 nM, respectively. Control 7d, like the corresponding bryostatin derivative, exhibited weak PKC affinity, as did the derivative, 9, lacking the spacer domain. Like bryostatin, acetal 7c exhibited significant levels of in vitro growth inhibitory activity (1.8–170 ng/ml) against several human cancer cell lines, providing an important step toward the development of simplified, synthetically accessible analogs of the bryostatins.
Abstract:
We describe the time evolution of gene expression levels by using a time translational matrix to predict future expression levels of genes based on their expression levels at some initial time. We deduce the time translational matrix for previously published DNA microarray gene expression data sets by modeling them within a linear framework by using the characteristic modes obtained by singular value decomposition. The resulting time translation matrix provides a measure of the relationships among the modes and governs their time evolution. We show that a truncated matrix linking just a few modes is a good approximation of the full time translation matrix. This finding suggests that the number of essential connections among the genes is small.
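A hedged sketch of the described approach on synthetic data: extract characteristic modes by singular value decomposition, fit a linear time-translation matrix over a truncated set of modes by least squares, and predict the next timepoint. The two hidden modes and the matrix dimensions are illustrative assumptions; the published datasets differ:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "expression" data: 50 genes x 12 timepoints, driven by
# two hidden oscillatory modes (an assumption for illustration).
t = np.arange(12)
modes = np.vstack([np.sin(0.5 * t), np.cos(0.5 * t)])
loadings = rng.normal(size=(50, 2))
X = loadings @ modes

# Characteristic modes via SVD; keep a truncated set of k modes.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
V = Vt[:k]                       # mode time courses, shape (k, timepoints)

# Fit the truncated time-translation matrix M mapping mode values at
# time i to time i+1, by least squares: V[:, 1:] ~ M @ V[:, :-1].
M, *_ = np.linalg.lstsq(V[:, :-1].T, V[:, 1:].T, rcond=None)
M = M.T

# One-step prediction of the last timepoint from the previous one.
pred = (U[:, :k] * s[:k]) @ (M @ V[:, -2])
err = np.linalg.norm(pred - X[:, -1]) / np.linalg.norm(X[:, -1])
```

Because the synthetic dynamics are exactly linear in the two modes, the truncated translation matrix reproduces the last timepoint almost exactly, mirroring the paper's finding that a few linked modes suffice.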
Abstract:
A statistical modeling approach is proposed for use in searching large microarray data sets for genes that have a transcriptional response to a stimulus. The approach is unrestricted with respect to the timing, magnitude or duration of the response, or the overall abundance of the transcript. The statistical model accommodates systematic heterogeneity in expression levels. The corresponding data analyses provide gene-specific information, and the approach provides a means of evaluating the statistical significance of that information. To illustrate this strategy we have derived a model depicting the profile expected for a periodically transcribed gene and used it to look for budding yeast transcripts that adhere to this profile. Using objective criteria, this method identifies 81% of the known periodic transcripts, as well as 1,088 genes that show significant periodicity in at least one of the three data sets analyzed. However, only one-quarter of these genes show significant oscillations in at least two data sets and can be classified as periodic with high confidence. The method provides estimates of the mean activation and deactivation times, induced and basal expression levels, and statistical measures of the precision of these estimates for each periodic transcript.
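As a hedged, simplified illustration of model-based periodicity screening: fit a periodic profile to each transcript's time course and score how much variance it explains beyond a flat model. A cosine profile and an R-squared statistic stand in here for the paper's activation/deactivation profile model and its significance procedure, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 100, 5.0)           # hypothetical sampling times, minutes
period = 60.0                        # assumed cell-cycle length, minutes

def periodic_fit_stat(x, t, period):
    """R^2 of a cosine model (mean + cos + sin term at the assumed
    period) versus a flat, mean-only model."""
    A = np.column_stack([np.ones_like(t),
                         np.cos(2 * np.pi * t / period),
                         np.sin(2 * np.pi * t / period)])
    coef, *_ = np.linalg.lstsq(A, x, rcond=None)
    resid = x - A @ coef
    return 1.0 - np.sum(resid ** 2) / np.sum((x - x.mean()) ** 2)

# A clearly periodic transcript versus a flat, noisy one (synthetic).
periodic = 1.0 + 0.8 * np.cos(2 * np.pi * t / period) \
           + 0.1 * rng.normal(size=t.size)
flat = 1.0 + 0.1 * rng.normal(size=t.size)
```

A real screen would turn such a statistic into a p-value (as the paper does with its profile model) rather than thresholding R-squared directly.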
Abstract:
The coupling of cerebral blood flow (CBF) and the cerebral metabolic rate for oxygen (CMRO2) in physiologically activated brain states remains the subject of debate. Recently it was suggested that CBF is tightly coupled to oxidative metabolism in a nonlinear fashion. As part of this hypothesis, mathematical models of oxygen delivery to the brain have been described in which disproportionately large increases in CBF are necessary to sustain even small increases in CMRO2 during activation. We have explored the coupling of CBF and oxygen delivery using two complementary methods. First, a more complex mathematical model was tested that differs from those recently described in that no assumptions were made regarding tissue oxygen level. Second, [15O]water CBF positron emission tomography (PET) studies were conducted in nine healthy subjects during states of visual activation and hypoxia to examine the relationship between CBF and oxygen delivery. In contrast to previous reports, our model showed that adequate tissue levels of oxygen could be maintained without the need for increased CBF or oxygen delivery. Similarly, the PET studies demonstrated that the regional increase in CBF during visual activation was not affected by hypoxia. These findings strongly indicate that the increase in CBF associated with physiological activation is regulated by factors other than local oxygen requirements.
Abstract:
Multibody System Dynamics has revolutionized Mechanical Engineering Design by using mathematical models to simulate and optimize the dynamic behavior of a wide range of mechanical systems. These mathematical models not only provide valuable information about a system that could otherwise be obtained only through experiments with prototypes, but have also enabled the development of many model-based control systems. This work contributes to the dynamic modeling of multibody mechanical systems by developing a novel recursive modular methodology that unifies the main contributions of several Classical Mechanics formalisms. The reason for proposing such a methodology is to motivate the implementation of computational routines for modeling complex multibody mechanical systems without depending on closed-source software and, consequently, to contribute to the teaching of Multibody System Dynamics at the undergraduate and graduate levels. All the theoretical developments are based on and motivated by a critical literature review, leading to a general matrix form of the dynamic equations of motion of a multibody mechanical system (which can be expressed in terms of any set of variables adopted for the description of the motions performed by the system, even if such a set includes redundant variables) and to a general recursive methodology for obtaining mathematical models of complex systems given a set of equations describing the dynamics of each of its uncoupled subsystems and another set describing the constraints among these subsystems in the assembled system. This work also includes discussions on the description of motion (using any possible set of motion variables and admitting any kind of constraint that can be expressed by an invariant), and on the conditions for solving forward and inverse dynamics problems given a mathematical model of a multibody system.
Finally, some examples of computational packages based on the novel methodology, along with some case studies, are presented, highlighting the contributions that can be achieved by using the proposed methodology.
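The idea of a matrix-form equation of motion over redundant variables with constraints can be illustrated with the standard textbook descriptor form (not the specific methodology proposed in the work): solve M*qdd + A^T*lam = f together with A*qdd = b, here for a planar pendulum modeled as a point mass in redundant coordinates:

```python
import numpy as np

def constrained_accel(M, f, A, b):
    """Solve the descriptor (KKT) system
        [ M  A^T ] [qdd]   [ f ]
        [ A   0  ] [lam] = [ b ]
    for accelerations qdd and constraint reactions lam."""
    n, m_c = M.shape[0], A.shape[0]
    K = np.block([[M, A.T], [A, np.zeros((m_c, m_c))]])
    sol = np.linalg.solve(K, np.concatenate([f, b]))
    return sol[:n], sol[n:]

# Planar pendulum as a constrained point mass: redundant coordinates
# q = (x, y) and the invariant constraint x^2 + y^2 - L^2 = 0.
m, g, L = 1.0, 9.81, 1.0
q = np.array([L, 0.0])                       # horizontal position, at rest
qd = np.zeros(2)
M = m * np.eye(2)
f = np.array([0.0, -m * g])                  # gravity
A = np.array([[2 * q[0], 2 * q[1]]])         # constraint Jacobian d(phi)/dq
b = np.array([-2.0 * (qd @ qd)])             # right-hand side of A*qdd = b
qdd, lam = constrained_accel(M, f, A, b)
```

At this configuration the rod is horizontal and the mass is at rest, so the solver returns a purely tangential (vertical) acceleration of magnitude g and a zero constraint reaction.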
Abstract:
Unripe banana flour (UBF) is produced from bananas that have not undergone the maturation process and is an interesting alternative for reducing fruit losses related to inappropriate handling or fast ripening. UBF is considered a functional ingredient that improves glycemic and plasma insulin levels in blood and has also shown efficacy in the control of satiety and insulin resistance. The aim of this work was to study the drying process of unripe banana slabs (Musa cavendishii, Nanicão), developing a transient drying model through mathematical modeling with simultaneous moisture and heat transfer. The raw material was characterized, and the drying process was then conducted at 40 ºC, 50 ºC and 60 ºC; the product temperature was recorded using thermocouples, and the air velocity inside the chamber was 4 m·s-1. With the experimental data it was possible to validate the diffusion model based on Fick's second law and Fourier's law. For this purpose, the sorption isotherms were measured and fitted to the GAB model, estimating the equilibrium moisture content (Xe) as 1.76 g H2O/100 g d.b. at 60 ºC and 10% relative humidity (RH); the thermophysical properties (k, Cp, ρ) were also measured for use in the model. Five cases were considered: i) constant thermophysical properties; ii) variable properties; iii) estimation of the mass transfer coefficient (hm), heat transfer coefficient (h) and effective diffusivity (De), yielding 4.91x10-5 m·s-1, 134 W·m-2·K-1 and 3.278x10-10 m2·s-1 at 60 ºC, respectively; iv) variable De, which showed third-order polynomial behavior as a function of moisture content; v) shrinkage, which affected the mathematical model especially in the first 3 hours of the process: the thickness contracted by about (30.34 ± 1.29)% of its initial value, and two decreasing drying rate periods (DDR I and DDR II) were found, with De of 3.28x10-10 m2·s-1 and 1.77x10-10 m2·s-1, respectively.
COMSOL Multiphysics simulations were performed using the heat and mass transfer coefficients estimated by the mathematical modeling.
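A minimal sketch of the constant-property diffusion case: the textbook analytical series solution of Fick's second law for an infinite slab with uniform initial moisture and the surface at equilibrium. The diffusivity magnitude matches the order reported above, but the slab half-thickness is an assumption, and the full coupled heat-and-mass model with shrinkage is not reproduced here:

```python
import math

def moisture_ratio_slab(t, De, half_thickness, n_terms=50):
    """Moisture ratio MR(t) for an infinite slab (Fick's 2nd law,
    constant De, no shrinkage):
    MR = (8/pi^2) * sum over odd m of (1/m^2) *
         exp(-m^2 * pi^2 * De * t / (4 * L^2))"""
    total = 0.0
    for n in range(n_terms):
        m = 2 * n + 1
        total += (1.0 / m ** 2) * math.exp(
            -(m ** 2) * math.pi ** 2 * De * t
            / (4.0 * half_thickness ** 2))
    return (8.0 / math.pi ** 2) * total

De = 3.28e-10       # m^2/s, order of magnitude from the abstract
L = 2.5e-3          # m, assumed half-thickness (5 mm slab, hypothetical)
mr_3h = moisture_ratio_slab(3 * 3600, De, L)
```

Under these assumptions roughly 80% of the removable moisture is gone after 3 hours, which is the time window where the abstract notes shrinkage matters most, so the constant-geometry solution is only a first approximation there.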
Abstract:
This paper proposes the model of an Innovative Monitoring Network involving properly connected nodes to develop an Information and Communication Technology (ICT) solution for the preventive maintenance of historical centres based on early warnings. It is well known that the protection of historical centres generally ranges from large-scale monitoring to local monitoring, and it could be supported by a single ICT solution. More specifically, the models of a virtually organized monitoring system could enable the implementation of automated analyses presenting various alert levels. An adequate ICT tool would allow the definition of a monitoring network for shared processing of data and results. Thus, a possible retrofit solution could be planned for pilot cases shared among the nodes of the network on the basis of a suitable procedure using a retrofit catalogue. The final objective is to provide a model of an innovative tool to identify hazards, damage and possible retrofit solutions for historical centres, providing easy early warning support for stakeholders. The action could proactively target the needs and requirements of users, such as decision makers responsible for damage mitigation and the safeguarding of cultural heritage assets.
Abstract:
The microfoundations research agenda presents an expanded theoretical perspective because it considers individuals, their characteristics, and their interactions as relevant variables to help us understand firm-level strategic issues. However, microfoundations empirical research faces unique challenges because processes take place at different levels of analysis and these multilevel processes must be considered simultaneously. We describe multilevel modeling and mixed methods as methodological approaches whose use will allow for theoretical advancements. We describe key issues regarding the use of these two types of methods and, more importantly, discuss pressing substantive questions and topics that can be addressed with each of these methodological approaches with the goal of making theoretical advancements regarding the microfoundations research agenda and strategic management studies in general.