958 results for Process Modeling
Abstract:
In this paper a modified algorithm is suggested for developing polynomial neural network (PNN) models. Optimal partial description (PD) modeling is introduced at each layer of the PNN expansion, a task accomplished using the orthogonal least squares (OLS) method. Based on the initial PD models determined by the polynomial order and the number of PD inputs, OLS selects the most significant regressor terms, reducing the output error variance. The method produces PNN models exhibiting a high level of accuracy and superior generalization capabilities. Additionally, parsimonious models are obtained, comprising a considerably smaller number of parameters than those generated by the conventional PNN algorithm. Three benchmark examples are presented, including modeling of the gas furnace process as well as the iris and wine classification problems. Extensive simulation results and comparisons with other methods in the literature demonstrate the effectiveness of the suggested modeling approach.
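The OLS term-selection step this abstract describes can be sketched as a greedy search over the error reduction ratio (ERR) of each candidate regressor. This is a minimal illustration of the generic OLS idea, not the paper's implementation; the function name and interface are hypothetical.

```python
import numpy as np

def ols_select(P, y, n_terms):
    """Greedy forward selection of regressor columns of P by error
    reduction ratio (ERR), using Gram-Schmidt orthogonalization."""
    W = P.astype(float).copy()        # working (orthogonalized) candidates
    y = y.astype(float)
    sst = y @ y                       # total output energy
    selected = []
    for _ in range(n_terms):
        # ERR_i = (w_i . y)^2 / ((w_i . w_i) * (y . y)) for each candidate
        err = np.array([(w @ y) ** 2 / ((w @ w) * sst) if w @ w > 1e-12 else 0.0
                        for w in W.T])
        err[selected] = -1.0          # exclude already-chosen columns
        k = int(np.argmax(err))
        selected.append(k)
        wk = W[:, k]
        # orthogonalize the remaining candidates against the chosen regressor
        for j in range(W.shape[1]):
            if j not in selected:
                W[:, j] -= (wk @ W[:, j]) / (wk @ wk) * wk
    return selected
```

Because the candidates are re-orthogonalized after each pick, the ERRs of the chosen terms sum toward the fraction of output variance explained, which is what makes the greedy stopping rule meaningful.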
Abstract:
This paper introduces and evaluates DryMOD, a dynamic water balance model of the key hydrological processes in drylands that is based on free, public-domain datasets. The rainfall model of DryMOD makes optimal use of spatially disaggregated Tropical Rainfall Measuring Mission (TRMM) datasets to simulate hourly rainfall intensities at a spatial resolution of 1 km. Regional-scale applications of the model in seasonal catchments in Tunisia and Senegal characterize runoff and soil moisture distribution and dynamics in response to varying rainfall data inputs and soil properties. The results highlight the need for hourly-based rainfall simulation and for correcting TRMM 3B42 rainfall intensities for the fractional cover of rainfall (FCR). Without FCR correction and disaggregation to 1 km, TRMM 3B42-based rainfall intensities are too low to generate surface runoff and to induce substantial changes in soil moisture storage. The outcomes of the sensitivity analysis show that topsoil porosity is the most important soil property for the simulation of runoff and soil moisture. Thus, we demonstrate the benefit of hydrological investigations at a scale for which reliable information on soil profile characteristics exists and which is sufficiently fine to account for their heterogeneity. Where such information is available, application of DryMOD can assist in the spatial and temporal planning of water harvesting according to runoff-generating areas and the runoff ratio, as well as in the optimization of agricultural activities based on a realistic representation of soil moisture conditions.
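The FCR correction mentioned above rests on a simple idea: a satellite pixel reports the areal mean rain rate, so when rain covers only a fraction of the pixel, the intensity over the raining part is the mean divided by that fraction. The sketch below shows only this generic scaling, not DryMOD's actual correction scheme; the function name is hypothetical.

```python
def correct_intensity(mean_rate_mm_h, fractional_cover):
    """Scale a grid-cell mean rainfall rate (mm/h) to the intensity over
    the raining fraction of the cell (generic FCR-style correction)."""
    if not 0.0 < fractional_cover <= 1.0:
        raise ValueError("fractional cover must be in (0, 1]")
    return mean_rate_mm_h / fractional_cover
```

For example, a pixel-mean rate of 2 mm/h with rain over a quarter of the pixel corresponds to 8 mm/h where it is actually raining, which is the difference between generating surface runoff and not.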
Abstract:
Many communication signal processing applications involve modelling and inverting complex-valued (CV) Hammerstein systems. We develop a new CV B-spline neural network approach for efficient identification of the CV Hammerstein system and effective inversion of the estimated CV Hammerstein model. Specifically, the CV nonlinear static function in the Hammerstein system is represented using the tensor product of two univariate B-spline neural networks. An efficient alternating least squares estimation method is adopted for identifying the CV linear dynamic model's coefficients and the CV B-spline neural network's weights; it yields closed-form solutions for both, and the estimation process is guaranteed to converge rapidly to a unique minimum solution. Furthermore, an accurate inversion of the CV Hammerstein system can readily be obtained using the estimated model. In particular, the inversion of the CV nonlinear static function in the Hammerstein system can be calculated effectively using a Gauss-Newton algorithm, which naturally incorporates the efficient De Boor algorithm with both the B-spline curve and first-order derivative recursions. The effectiveness of our approach is demonstrated by applying it to the equalisation of Hammerstein channels.
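The De Boor recursion referred to above evaluates a B-spline curve by repeated convex combinations of control points. The following is the standard real-valued De Boor algorithm (the paper builds complex-valued tensor-product networks on top of this core; the sketch is illustrative, not the authors' code).

```python
def de_boor(k, x, t, c, p):
    """De Boor's algorithm: evaluate a degree-p B-spline at x.
    t: knot vector, c: control points, k: knot span with t[k] <= x < t[k+1]."""
    # copy the p+1 control points that influence the span
    d = [c[j + k - p] for j in range(p + 1)]
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            # convex-combination weight for this level of the triangle
            alpha = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[p]
```

Since the recursion is built from additions and scalar multiplications only, it applies unchanged to complex control points, which is what makes it convenient inside a CV B-spline network.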
Abstract:
This article describes a case study involving information technology managers and their new programmer recruitment policy, but the primary interest is methodological. The processes of issue generation and selection and model conceptualization are described. Early use of "magnetic hexagons" allowed the generation of a range of issues, most of which would not have emerged if system dynamics elicitation techniques had been employed. With the selection of a specific issue, flow diagramming was used to conceptualize a model, with computer implementation and scenario generation following naturally. Observations are made on the processes of system dynamics modeling, particularly on the need to employ general techniques of knowledge elicitation in the early stages of interventions. It is proposed that flexible approaches should be used to generate, select, and study the issues, since these reduce any biasing of the elicitation toward system dynamics problems and also allow the participants to take up the most appropriate problem-structuring approach.
Abstract:
The top managers of a biotechnology startup firm agreed to participate in a system dynamics modeling project to help them think about the firm's growth strategy. The article describes how the model was created and used to stimulate debate and discussion about growth management. The paper highlights several novel features about the process used for capturing management team knowledge. A heavy emphasis was placed on mapping the operating structure of the factory and distribution channels. Qualitative modeling methods (structural diagrams, descriptive variable names, and friendly algebra) were used to capture the management team's descriptions of the business. Simulation scenarios were crafted to stimulate debate about strategic issues such as capacity allocation, capacity expansion, customer recruitment, customer retention, and market growth, and to engage the management team in using the computer to design strategic scenarios. The article concludes with comments on the impact of the project.
Abstract:
It is well known that social insects such as ants show interesting collective behaviors. How do they organize such behaviors? To expand understanding of the collective behaviors of social insects, we focused on the ant Diacamma and analyzed the behavior of a few individuals. In an experimental set-up, ants are placed in a hemisphere without a nest or food, and the trajectories of the ants are recorded. From this bottom-up approach, we found the following characteristics: (1) the activity of individuals increases and decreases periodically; (2) a spontaneous meeting process is observed between two ants, and the meeting spot of the two ants is localized in the experimental field.
Abstract:
A study of the potential role of aerosols in modifying clouds and precipitation is presented using a numerical atmospheric model. Measurements of cloud condensation nuclei (CCN) and cloud size distribution properties taken in the southwestern Amazon region during the transition from dry to wet seasons were used as guidelines to define the microphysical parameters for the simulations. Numerical simulations were carried out using the Brazilian Development on Regional Atmospheric Modeling System, and the results presented considerable sensitivity to changes in these parameters. High CCN concentrations, typical of polluted days, were found to result in increases or decreases in total precipitation, depending on the level of pollution used as a reference, showing a complexity that parallels the aerosol-precipitation interaction. Our results show that on the grids evaluated, higher CCN concentrations reduced low-to-moderate rainfall rates and increased high rainfall rates. The principal consequence of the increased pollution was a change from a warm to a cold rain process, which affected the maximum and overall mean accumulated precipitation. Under polluted conditions, cloud cover diminished, allowing greater amounts of solar radiation to reach the surface. Aerosol absorption of radiation in the lower layers of the atmosphere delayed convective evolution but produced higher maximum rainfall rates due to increased instability. In addition, the intensity of the surface sensible heat flux, as well as that of the latent heat flux, was reduced by the lower temperature difference between surface and air, producing greater energy stores at the surface.
Abstract:
Nonsyndromic cleft lip and palate (NSCL/P) is a complex disease resulting from failure of fusion of the facial primordia, a complex developmental process that includes the epithelial-mesenchymal transition (EMT). Detection of differential gene transcription between NSCL/P patients and control individuals offers an interesting alternative for investigating pathways involved in disease manifestation. Here we compared the transcriptomes of 6 dental pulp stem cell (DPSC) cultures from NSCL/P patients and 6 controls. Eighty-seven differentially expressed genes (DEGs) were identified. The most significant putative gene network comprised 13 of the 87 DEGs, of which 8 encode extracellular proteins: ACAN, COL4A1, COL4A2, GDF15, IGF2, MMP1, MMP3 and PDGFa. Through clustering analyses we also observed that MMP3, ACAN, COL4A1 and COL4A2 exhibit co-regulated expression. Interestingly, it is known that MMP3 cleaves a wide range of extracellular proteins, including collagens IV, V, IX and X, proteoglycans, fibronectin and laminin. It is also capable of activating other MMPs. Moreover, MMP3 had previously been associated with NSCL/P. The same general pattern was observed in a further sample, confirming the involvement of synchronized gene expression patterns which differed between NSCL/P patients and controls. These results show the robustness of our methodology for the detection of differentially expressed genes using the RankProd method. In conclusion, DPSCs from NSCL/P patients exhibit gene expression signatures involving genes associated with mechanisms of extracellular matrix modeling and palate EMT processes which differ from those observed in controls. This comparative approach should lead to a more rapid identification of gene networks predisposing to this complex malformation syndrome than conventional gene mapping technologies.
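The RankProd method named above scores each gene by the geometric mean of its ranks across replicate comparisons: a gene consistently ranked near the top in every replicate gets a rank product near 1. A minimal sketch of that statistic (not the RankProd package itself; the function name and input layout are assumptions):

```python
import numpy as np

def rank_products(fold_changes):
    """Rank-product statistic per gene: geometric mean across replicates
    of each gene's rank by fold change (rank 1 = most up-regulated).
    fold_changes: array of shape (genes, replicates)."""
    fc = np.asarray(fold_changes, dtype=float)
    # rank genes within each replicate, descending fold change -> rank 1
    order = np.argsort(-fc, axis=0)
    ranks = np.empty_like(fc)
    for k in range(fc.shape[1]):
        ranks[order[:, k], k] = np.arange(1, fc.shape[0] + 1)
    return np.exp(np.log(ranks).mean(axis=1))   # geometric mean of ranks
```

In practice significance is then assessed by permuting the ranks and asking how often a rank product this small occurs by chance, which is what makes the approach robust for small sample sizes like the 6-vs-6 design here.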
Abstract:
Human parasitic diseases are a foremost threat to human health and welfare around the world. Trypanosomiasis is a very serious infectious disease against which the currently available drugs are limited and not effective. Therefore, there is an urgent need for new chemotherapeutic agents. One attractive drug target is the major cysteine protease from Trypanosoma cruzi, cruzain. In the present work, comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) studies were conducted on a series of thiosemicarbazone and semicarbazone derivatives as inhibitors of cruzain. Molecular modeling studies were performed in order to identify the preferred binding mode of the inhibitors in the enzyme active site, and to generate structural alignments for the three-dimensional quantitative structure-activity relationship (3D QSAR) investigations. Statistically significant models were obtained (CoMFA, r2 = 0.96 and q2 = 0.78; CoMSIA, r2 = 0.91 and q2 = 0.73), indicating their predictive ability for untested compounds. The models were externally validated employing a test set, and the predicted values were in good agreement with the experimental results. The final QSAR models and the information gathered from the 3D CoMFA and CoMSIA contour maps provided important insights into the chemical and structural basis involved in the molecular recognition process of this family of cruzain inhibitors, and should be useful for the design of new structurally related analogs with improved potency. (C) 2009 Elsevier Inc. All rights reserved.
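The q2 values reported above are cross-validated analogues of r2: each compound is left out in turn, predicted from a model fit on the rest, and the predictive residual sum of squares (PRESS) is compared to the total variance. A generic leave-one-out q2 for an ordinary least squares model can be sketched as follows (an illustration of the statistic, not the CoMFA/CoMSIA PLS machinery; the function name is hypothetical):

```python
import numpy as np

def q2_loo(X, y):
    """Leave-one-out cross-validated q^2 for an OLS model:
    q^2 = 1 - PRESS / total sum of squares about the mean."""
    X = np.column_stack([np.ones(len(y)), np.asarray(X, dtype=float)])
    y = np.asarray(y, dtype=float)
    press = 0.0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i          # drop compound i
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        press += (y[i] - X[i] @ beta) ** 2     # predict the held-out point
    return 1.0 - press / np.sum((y - y.mean()) ** 2)
```

Because each prediction excludes the point being predicted, q2 is always below the fitted r2 (e.g. 0.78 vs 0.96 for the CoMFA model here), and a q2 well above zero is the usual internal-validation criterion in 3D QSAR.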
Abstract:
This work presents a Bayesian semiparametric approach for dealing with regression models where the covariate is measured with error. Given that (1) the error normality assumption is very restrictive, and (2) assuming a specific elliptical distribution for errors (Student-t, for example) may be somewhat presumptuous, there is a need for more flexible methods that assume only symmetry of errors (admitting unknown kurtosis). In this sense, the main advantage of this extended Bayesian approach is the possibility of considering generalizations of the elliptical family of models by using Dirichlet process priors in dependent and independent situations. Conditional posterior distributions are implemented, allowing the use of Markov chain Monte Carlo (MCMC) to generate the posterior distributions. An interesting result is that the Dirichlet process prior is not updated in the case of the dependent elliptical model. Furthermore, an analysis of a real data set is reported to illustrate the usefulness of our approach in dealing with outliers. Finally, the proposed semiparametric models and the parametric normal model are compared graphically via the posterior density of the coefficients. (C) 2009 Elsevier Inc. All rights reserved.
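A Dirichlet process prior of the kind used above is most easily pictured through its stick-breaking construction: the mixture weights are obtained by repeatedly breaking off Beta-distributed fractions of a unit stick. A truncated sketch of that construction (generic DP machinery, not the paper's specific model; the function name is an assumption):

```python
import numpy as np

def stick_breaking(alpha, n_atoms, rng):
    """Truncated stick-breaking weights for a Dirichlet process prior:
    w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha)."""
    v = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    return v * remaining
```

Smaller concentration parameters `alpha` put most of the mass on a few atoms (few mixture components), while larger values spread it out, which is how the prior lets the error distribution's shape, including its kurtosis, be learned from the data.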
Abstract:
This paper performs a thorough statistical examination of the time-series properties of the daily market volatility index (VIX) from the Chicago Board Options Exchange (CBOE). The motivation lies not only in the widespread consensus that the VIX is a barometer of overall market sentiment concerning investors' risk appetite, but also in the fact that many trading strategies rely on the VIX index for hedging and speculative purposes. Preliminary analysis suggests that the VIX index displays long-range dependence. This is well in line with the strong empirical evidence in the literature supporting long memory in both options-implied and realized variances. We thus resort to both parametric and semiparametric heterogeneous autoregressive (HAR) processes for modeling and forecasting purposes. Our main findings are as follows. First, we confirm the evidence in the literature that there is a negative relationship between the VIX index and the S&P 500 index return, as well as a positive contemporaneous link with the volume of the S&P 500 index. Second, the term spread has a slightly negative long-run impact on the VIX index, when possible multicollinearity and endogeneity are controlled for. Finally, we cannot reject the linearity of the above relationships, neither in sample nor out of sample. As for the latter, we actually show that it is very hard to beat the pure HAR process because of the very persistent nature of the VIX index.
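The HAR process referred to above regresses today's volatility on three backward-looking averages at different horizons: yesterday's value, the past-week (5-day) mean, and the past-month (22-day) mean. A minimal OLS sketch of that structure (the standard HAR specification, not the paper's exact parametric/semiparametric variants; function names are hypothetical):

```python
import numpy as np

def har_design(v):
    """Build HAR regressors for a (log-)volatility series v:
    constant, daily lag, 5-day mean, and 22-day mean."""
    v = np.asarray(v, dtype=float)
    rows, target = [], []
    for t in range(22, len(v)):
        rows.append([1.0, v[t - 1], np.mean(v[t - 5:t]), np.mean(v[t - 22:t])])
        target.append(v[t])
    return np.array(rows), np.array(target)

def har_fit(v):
    """OLS fit of the HAR model; returns [const, daily, weekly, monthly]."""
    X, y = har_design(v)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta
```

The overlapping horizons are what let this simple linear model mimic long-memory behavior, which is why, as the abstract notes, the pure HAR process is hard to beat out of sample for a persistent series like the VIX.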
Abstract:
Corporate governance has been in the spotlight for the past two decades, being the subject of numerous studies all over the world. Governance is pictured as a broad and diverse theme, evolving through different routes to form distinct systems. This scenario, together with two types of agency problems (investor vs. management and minorities vs. controlling shareholders), produces different definitions of governance. Usually, studies investigate whether corporate governance structures influence firm performance and company valuation. This approach implies that investors can identify those impacts and later take them into consideration when making investment decisions. However, behavioral finance theory shows that investors do not always make rational decisions, and therefore the modus operandi of those professionals needs to be understood. So, this research aimed to investigate to what extent Brazilian corporate governance standards and practices influence the investment decision-making process of equity markets' professionals from the sell-side and buy-side. This exploratory study was carried out through qualitative and quantitative approaches. In the qualitative phase, 8 practitioners were interviewed and 3 dimensions emerged: understanding, pertinence and practice. Based on the interviews' findings, a questionnaire was formulated and distributed to buy-siders and sell-siders that cover Brazilian stocks. 117 respondents from all over the world contributed to the study. The data obtained were analyzed through structural equation modeling and descriptive statistics. The 3 dimensions became 5 constructs: definition (institutionalized governance, informal governance), pertinence (relevance), and practice (valuation process, structured governance assessment). The results of this thesis suggest there is no definitive answer, as the extent to which governance will influence an investment decision process will depend on a number of circumstances which compose the context.
The only certainty is the need to present a “corporate governance behavior”, rather than simply establishing rules and regulations at firm and country level.
Abstract:
Generalized hypercompetitiveness in the world markets has created the need to offer better products to potential and actual clients in order to gain an advantage over other competitors. To ensure the production of an adequate product, enterprises need to work on the efficiency and efficacy of their business processes (BPs) by means of the construction of Interactive Information Systems (IISs, including Interactive Multimedia Documents) so that they are processed more fluidly and correctly. The construction of the correct IIS is a major task that can only be successful if the needs of every intervenient are taken into account. Their requirements must be defined with precision and extensively analyzed, and consequently the system must be accurately designed in order to minimize implementation problems, so that the IIS is produced on schedule and with as few mistakes as possible. The main contribution of this thesis is the proposal of Goals, a software (engineering) construction process which aims at defining the tasks to be carried out in order to develop software. This process defines the stakeholders, the artifacts, and the techniques that should be applied to achieve correctness of the IIS. Complementarily, this process suggests two methodologies to be applied in the initial phases of the lifecycle of the Software Engineering process: Process Use Cases for the requirements phase, and MultiGoals for the analysis and design phases. Process Use Cases is a UML-based (Unified Modeling Language), goal-driven and use-case-oriented methodology for the definition of functional requirements. It uses an information-oriented strategy in order to identify BPs while constructing the enterprise's information structure, and finalizes with the identification of use cases within the design of these BPs. This approach provides a useful tool for both activities of Business Process Management and Software Engineering.
MultiGoals is a UML-based, use-case-driven and architecture-centric methodology for the analysis and design of IISs with support for Multimedia. It proposes the analysis of user tasks as the basis of the design of: (i) the user interface; (ii) the system behaviour, which is modeled by means of patterns that can combine Multimedia and standard information; and (iii) the database and media contents. This thesis makes the theoretical presentation of these approaches, accompanied by examples from a real project, which provide the necessary support for the understanding of the techniques used.
Abstract:
In a world where organizations are ever more complex, the need for knowledge of the organizational self is a growing necessity. The DEMO methodology sets the goal of achieving a specification of the organizational self that captures the essence of the organization in a way that is independent of its implementation, and that is also coherent, consistent, complete, modular and objective. But having such a notion of the organizational self is of little value if it is not shared by the organization's actors. To achieve this goal in a society that has grown attached to technology and where time is of the utmost importance, a tool such as a semantic Wikipedia may be the perfect way of making the information accessible. However, to establish the DEMO methodology on such a platform, bridges must be created between its modeling components and the semantic Wikipedia. Our thesis focuses on that aspect: it tries to establish and implement, through a case study, the principles of a way of transforming the DEMO methodology diagrams into comprehensible pages on a semantic Wikipedia, while keeping them as abstract as possible to allow extensibility and generalization to all diagrams without losing any valuable information, so that, if desired, those diagrams may be recreated from the semantic pages, making the process a full cycle.
Abstract:
Nowadays, more than half of computer development projects fail to meet the final users' expectations. One of the main causes is insufficient knowledge about the organization of the enterprise to be supported by the respective information system. The DEMO methodology (Design and Engineering Methodology for Organizations) has proved to be a well-defined method to specify, through models and diagrams, the essence of any organization at a high level of abstraction. However, this methodology is platform-implementation independent, lacking the possibility of saving and propagating possible changes from the organization models to the implemented software in a runtime environment. The Universal Enterprise Adaptive Object Model (UEAOM) is a conceptual schema used as the basis for a wiki system, to allow the modeling of any organization, independent of its implementation, as well as the previously mentioned change propagation in a runtime environment. Based on DEMO and UEAOM, this project aims to develop efficient and standardized methods to enable an automatic conversion of DEMO Ontological Models, based on the UEAOM specification, into BPMN (Business Process Model and Notation) process models, using clear semantics, without ambiguities, in order to facilitate the creation of processes that are almost ready to be executed on workflow systems that support BPMN.