862 results for Integrated user model


Relevance: 30.00%

Abstract:

Matrix accumulation in the renal tubulointerstitium is predictive of a progressive decline in renal function. Transforming growth factor-beta(1) (TGF-beta(1)) and, more recently, connective tissue growth factor (CTGF) are recognized to play key roles in mediating the fibrogenic response, independently of the primary renal insult. Further definition of the independent and interrelated effects of CTGF and TGF-beta(1) is critical for the development of effective antifibrotic strategies. CTGF (20 ng/ml) induced fibronectin and collagen IV secretion in primary cultures of human proximal tubule cells (PTC) and cortical fibroblasts (CF) compared with control values (P < 0.005 in all cases). This effect was inhibited by neutralizing antibodies to either TGF-beta or to the TGF-beta type II receptor (TbetaRII). TGF-beta(1) induced a greater increase in fibronectin and collagen IV secretion in both PTC (P < 0.01) and CF (P < 0.01) than that observed with CTGF alone. The effects of TGF-beta(1) and CTGF in combination were additive on both PTC and CF fibronectin and collagen IV secretion. TGF-beta(1) (2 ng/ml) stimulated CTGF mRNA expression within 30 min, which was sustained for up to 24 h, with a consequent increase in CTGF protein (P < 0.05), whereas CTGF had no effect on TGF-beta(1) mRNA or protein expression. TGF-beta(1) (2 ng/ml) induced phosphorylated (p)Smad-2 within 15 min, which was sustained for up to 24 h. CTGF had a delayed effect on increasing pSmad-2 expression, which was evident at 24 h. In conclusion, this study has demonstrated the key dependence of the fibrogenic actions of CTGF on TGF-beta. It has further uniquely demonstrated that CTGF requires TGF-beta, signaling through the TbetaRII, in both PTCs and CFs to exert its fibrogenic response in this in vitro model.

Relevance: 30.00%

Abstract:

The integrated chemical-biological degradation combining advanced oxidation by UV/H2O2 followed by aerobic biodegradation was used to degrade C.I. Reactive Azo Red 195A, commonly used in the textile industry in Australia. An experimental design based on the response surface method was applied to evaluate the interactive effects of the influencing factors (UV irradiation time, initial hydrogen peroxide dosage and recirculation ratio of the system) on decolourisation efficiency and to optimise the operating conditions of the treatment process. The effects were determined by measuring dye concentration and soluble chemical oxygen demand (S-COD). The results showed that the dye and S-COD removal were affected by all factors, both individually and interactively. Maximal colour degradation performance was predicted, and experimentally validated, with no recirculation, 30 min UV irradiation and 500 mg H2O2/L. The model predictions for colour removal, based on a three-factor/five-level Box-Wilson central composite design and response surface method analysis, were found to be very close to additional experimental results obtained under near-optimal conditions. This demonstrates the benefits of this approach in achieving good predictions while minimising the number of experiments required. (c) 2006 Elsevier B.V. All rights reserved.
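As an illustration of the response-surface idea described above, the sketch below fits a full quadratic model to synthetic two-factor data by least squares. The factor names, data, and coefficients are hypothetical, not taken from the study; a real central composite design would use five coded levels per factor and replicated centre points.

```python
import numpy as np

# Illustrative sketch only: fit a full quadratic response surface
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# by least squares. Factors (x1 = UV time, x2 = H2O2 dose, both in
# coded units) and data are synthetic, not taken from the study.

def quadratic_design_matrix(x1, x2):
    """Columns: 1, x1, x2, x1^2, x2^2, x1*x2."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

rng = np.random.default_rng(0)
x1 = rng.uniform(-2, 2, 30)
x2 = rng.uniform(-2, 2, 30)
# Synthetic "true" surface used to generate the responses.
y = 80 + 5*x1 + 3*x2 - 2*x1**2 - 1*x2**2 + 0.5*x1*x2

X = quadratic_design_matrix(x1, x2)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
# On noise-free data the fit recovers [80, 5, 3, -2, -1, 0.5].
print(np.round(coef, 3))
```

The fitted surface can then be evaluated over a grid of factor settings to locate the predicted optimum, mirroring the optimisation step described in the abstract.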

Relevance: 30.00%

Abstract:

Traditional measures of termite food preference assess consequences of foraging behavior such as wood consumption, aggregation and/or termite survivorship. Although studies have been done to investigate the specifics of foraging behavior, this is not generally integrated into choice assay experiments. Here, choice assays conducted with small isolated (orphaned) groups of workers were compared with choice assays involving foragers from whole nests (non-orphaned) in the laboratory. Aggregation to two different wood types was used as a measure of preference. The specific worker castes and instars participating in initial exploration were compared between assay methods, with samples of termites taken from nest carton material and from sites where termites were feeding. Aggregation results differ between choice assay techniques. The castes and instars responsible for initial exploration, as determined in whole-nest trials, were not commonly found exploring in isolated-group trials, nor were they numerous in termites taken from active feeding sites. Consequently, the use of small groups of M. turneri worker termites extracted from active feeding sites may not be appropriate for choice assays.

Relevance: 30.00%

Abstract:

Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation, and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model, using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns, with transitional gradients from one vegetation community to another, whereas the application of statistical methods alone can impose arbitrary, often unrealistic, sharp boundaries on the model. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including site sampling, variable selection, model selection, model implementation, internal model assessment, model prediction assessment, integration of discrete vegetation community models to generate a composite pre-clearing vegetation map, model validation against an independent data set, and assessment of the scale of model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r^2 = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including provision of vital information for conservation planning and management; a scientific basis for rehabilitation of disturbed and cleared areas; and a viable method for producing adequate vegetation maps for conservation and forestry planning in poorly-studied areas. (c) 2006 Elsevier B.V. All rights reserved.
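The integration step that combines the separate community models into one composite map can be sketched as below. This is a minimal illustration assuming a simple highest-probability rule; the tiny rasters and community names are toy stand-ins for the paper's 28 models.

```python
import numpy as np

# Hypothetical sketch: given per-pixel probability surfaces from several
# independent community models, assign each pixel to the community with
# the highest predicted probability to form a composite vegetation map.
# The 2x2 "rasters" below are toy data, not the paper's models.
prob_maps = {
    "rainforest":  np.array([[0.8, 0.2], [0.1, 0.6]]),
    "sclerophyll": np.array([[0.1, 0.7], [0.3, 0.3]]),
    "wetland":     np.array([[0.1, 0.1], [0.6, 0.1]]),
}

names = list(prob_maps)
stack = np.stack([prob_maps[n] for n in names])   # shape (3, 2, 2)
composite = np.take(names, np.argmax(stack, axis=0))  # winning name per pixel
print(composite)
```

In practice the same rule would be applied to full rasters inside the GIS, with additional ecological rules resolving ties or implausible assignments.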

Relevance: 30.00%

Abstract:

Three important goals in describing software design patterns are: generality, precision, and understandability. To address these goals, this paper presents an integrated approach to specifying patterns using Object-Z and UML. To achieve the generality goal, we adopt a role-based metamodeling approach to define patterns. With this approach, each pattern is defined as a pattern role model. To achieve precision, we formalize role concepts using Object-Z (a role metamodel) and use these concepts to define patterns (pattern role models). To achieve understandability, we represent the role metamodel and pattern role models visually using UML. Our pattern role models provide a precise basis for pattern-based model transformations or refactoring approaches.

Relevance: 30.00%

Abstract:

The introduction of standard on-chip buses has eased integration and boosted the production of IP functional cores. However, once an IP is bus-specific, retargeting it to a different bus is time-consuming and tedious, which reduces the reusability of the bus-specific IP. As new bus standards are introduced and different interconnection methods are proposed, this problem grows. Many solutions have been proposed; however, these solutions either limit the IP block performance or are restricted to a particular platform. A new concept is presented that can connect IP blocks to a wide variety of interface architectures with low overhead. This is achieved through the use of a special interface adaptor logic layer.

Relevance: 30.00%

Abstract:

Collaborative recommendation is one of the most widely used recommendation techniques; it recommends items to a visitor by referring to the preferences of other users who are similar to the current user. User profiling based on Web transaction data can capture such informative knowledge of user tasks or interests. With the discovered usage-pattern information, it becomes possible to recommend more relevant content to Web users, or to customize the Web presentation for visitors, via collaborative recommendation. In addition, it helps to identify the underlying relationships among Web users, items and latent tasks during Web mining. In this paper, we propose a Web recommendation framework based on user profiling. In this approach, we employ Probabilistic Latent Semantic Analysis (PLSA) to model co-occurrence activities and develop a modified k-means clustering algorithm to build user profiles as representatives of usage patterns. Moreover, the hidden task model is derived by characterizing the meaningful latent factor space. Given the discovered user profiles, we then choose the profile whose preferences are most similar to those of the current user and make collaborative recommendations based on the corresponding page weights in the selected profile. Preliminary experimental results on real-world data sets show that the proposed approach makes recommendations accurately and efficiently.
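The final matching-and-recommendation step described above can be sketched as follows. The profile centroids, page names, and weights here are illustrative stand-ins for the output of the PLSA/k-means stage, not values from the paper.

```python
import numpy as np

# Hypothetical sketch: pick the user profile (cluster centroid of
# page-weight vectors) most similar to the current user's partial
# session, then recommend the highest-weighted unvisited pages.

def cosine(a, b):
    """Cosine similarity between two weight vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

pages = ["home", "news", "sports", "finance", "weather"]
profiles = np.array([           # rows: usage-pattern centroids (toy data)
    [0.9, 0.8, 0.1, 0.0, 0.2],  # news-oriented profile
    [0.2, 0.1, 0.9, 0.0, 0.7],  # sports-oriented profile
])

session = np.array([0.0, 0.0, 1.0, 0.0, 0.5])  # current user's page visits
best = max(range(len(profiles)), key=lambda i: cosine(session, profiles[i]))
unvisited = [j for j in range(len(pages)) if session[j] == 0]
recs = sorted(unvisited, key=lambda j: -profiles[best][j])[:2]
print([pages[j] for j in recs])  # top unvisited pages from the matched profile
```

Here the session is closest to the sports-oriented centroid, so the recommendations come from that profile's remaining high-weight pages.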

Relevance: 30.00%

Abstract:

In this paper, we present a framework for pattern-based model evolution in the MDA context. In the framework, users define patterns using a pattern modeling language designed to describe software design patterns, and they can use the patterns as rules to evolve their models. Design model evolution takes place in two steps. The first step is a binding process of selecting a pattern and defining where and how to apply the pattern in the model. The second step is an automatic model transformation that actually evolves the model according to the binding information and the pattern rule. The pattern modeling language is defined in terms of a MOF-based role metamodel, implemented using an existing modeling framework (EMF), and incorporated as a plugin to the Eclipse modeling environment. The model evolution process is also implemented as an Eclipse plugin. With these two plugins, we provide an integrated framework in which patterns can be defined and validated, and models evolved based on those patterns, within a single modeling environment.

Relevance: 30.00%

Abstract:

Since the Object Management Group (OMG) commenced its Model Driven Architecture (MDA) initiative, there has been considerable activity proposing and building automatic model transformation systems to help implement the MDA concept. Much less attention has been given to the need to ensure that model transformations generate the intended results. This paper explores one aspect of validation and verification for MDA: coverage of the source and/or target metamodels by a set of model transformations. The paper defines the property of metamodel coverage and some corresponding algorithms. This property helps the user assess which parts of a source (or target) metamodel are referenced by a given model transformation set. Some results are presented from a prototype implementation built on the Eclipse Modeling Framework (EMF).
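A minimal sketch of the coverage property is below, assuming a metamodel is reduced to a set of element names and each transformation rule to the set of elements it references. The element and rule names are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of metamodel coverage: given the elements of a
# source metamodel and the elements each transformation rule references,
# report which elements are covered and the coverage ratio.

metamodel = {"Class", "Attribute", "Operation", "Association", "Package"}

transformations = {                      # rule name -> referenced elements
    "class2table": {"Class", "Attribute"},
    "assoc2fkey":  {"Association", "Class"},
}

covered = set().union(*transformations.values()) & metamodel
uncovered = metamodel - covered
coverage = len(covered) / len(metamodel)
print(sorted(uncovered), coverage)  # elements no rule touches, and the ratio
```

A real implementation would walk EMF model instances to collect the referenced metaclasses rather than relying on hand-written sets, but the reported property is the same.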

Relevance: 30.00%

Abstract:

In Brazil, researchers have been intensifying their work in the fields of administration and public administration, revisiting the older models (patrimonialist, managerialist and bureaucratic) and highlighting a new trend of participative management and the relationship between these models and the degree of satisfaction that users of public services attribute to the State. Knowing this degree of satisfaction is a relevant factor in the design and implementation of public policies. In this context, this study aims to assess the degree of user satisfaction with selected public services in the Greater ABCD region. The research universe is limited to residents of the municipalities of Santo André, São Bernardo do Campo, São Caetano and Diadema. The descriptive study used statistical tools to compare the degree of satisfaction across the municipalities studied. Data were collected using a questionnaire adapted from a research instrument used by the Federal Government, composed of questions developed and grouped into four dimensions: Transport, Health, Education and Public Security. Working with the means, it was possible to observe that the municipality of São Caetano do Sul shows some differences (higher satisfaction scores) relative to the other municipalities surveyed, and also that some of these municipalities show, on average, similar results in some dimensions.

Relevance: 30.00%

Abstract:

The data available during the drug discovery process is vast in amount and diverse in nature. To gain useful information from such data, an effective visualisation tool is required. To provide better visualisation facilities to domain experts (screening scientists, biologists, chemists, etc.), we developed software based on recently developed principled visualisation algorithms such as Generative Topographic Mapping (GTM) and Hierarchical Generative Topographic Mapping (HGTM). The software also supports conventional visualisation techniques such as Principal Component Analysis (PCA), NeuroScale, PhiVis, and Locally Linear Embedding (LLE), and provides global and local regression facilities, supporting regression algorithms such as the Multilayer Perceptron (MLP), Radial Basis Function networks (RBF), Generalised Linear Models (GLM), Mixture of Experts (MoE), and the newly developed Guided Mixture of Experts (GME). This user manual gives an overview of the purpose of the software tool, highlights some of the issues to be taken care of when creating a new model, and provides information about how to install and use the tool. The user manual does not require readers to be familiar with the algorithms it implements; basic computing skills are enough to operate the software.

Relevance: 30.00%

Abstract:

The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of the occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes the need for a greater dependency on the potential pollution sources, rather than the traditional approach where assessment is based mainly on the intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation methods, whereby random pollution events are generated to the same distribution as historically occurring events or an a priori potential probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as a risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and automated using a DOS batch-processing file. GIS software was employed to produce the input files and to present the results. The functionality of the method, as well as its sensitivity to the model grid sizes, contaminant loading rates, length of stress periods, and the historical frequencies of occurrence of pollution events, was evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input into the transport model. The results of the applications of the method are presented in the form of tables, graphs and spatial maps.
Varying the model grid sizes indicates no significant effects on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with increasing grid sizes. Also, the migration of the contaminant plume advances faster with coarse grid sizes than with finer grid sizes. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases in non-linear proportion to the increasing frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and quantitatively presented as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a great asset to the method, and a large benefit over contemporary risk and vulnerability methods.
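The exceedance-counting risk metric described above can be sketched as follows. The event probability, source-term distribution, threshold, and the one-line "transport model" are all hypothetical stand-ins for the MODFLOW/MT3DMS workflow.

```python
import numpy as np

# Hypothetical sketch of the Monte Carlo risk metric: trigger a
# pollution source term whenever a random number falls below the event
# probability, propagate a toy concentration, and quantify risk as the
# fraction of realisations in which a user-defined threshold is exceeded.
rng = np.random.default_rng(42)

P_EVENT = 0.3           # a priori probability of a pollution event
THRESHOLD = 50.0        # user-defined concentration magnitude (mg/L)
N_REALISATIONS = 10_000

def toy_transport(source_mass):
    """Stand-in for the transport model: concentration at the monitor."""
    return 0.8 * source_mass  # simple attenuation factor

exceedances = 0
for _ in range(N_REALISATIONS):
    if rng.random() < P_EVENT:            # event occurs in this realisation
        source = rng.uniform(0, 100)      # synthetic source term (mg/L)
        if toy_transport(source) > THRESHOLD:
            exceedances += 1

risk = exceedances / N_REALISATIONS
print(round(risk, 3))
```

With these toy distributions the expected risk is roughly 0.3 × P(0.8·U(0,100) > 50) ≈ 0.11; in the thesis the transport step is a full flow-and-transport simulation per realisation.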

Relevance: 30.00%

Abstract:

E-grocery is gradually becoming viable, or a necessity, for many families. Yet most e-supermarkets are seen mainly as providers of low-value "staple" and bulky goods. While each store has a large number of SKUs available, these products are mainly necessity goods with low marginal value for hedonistic consumption. Needs to acquire diverse products (e.g., organic), premium-priced products (e.g., wine) for special occasions (e.g., anniversaries, birthdays), or products for health-related reasons (e.g., allergies, diabetes) are yet to be served via one-stop e-tailers. In this paper, we design a mathematical model that takes into account consumers' geo-demographics and multi-product sourcing capacity for creating critical mass and profit. Our mathematical model is a variant of the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW), which we extend by adding intermediate locations where trucks can meet and exchange goods. We illustrate our model for the city of Istanbul using GIS maps, and discuss its various extensions as well as its managerial implications.
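A core building block of any CVRPTW formulation is the time-window feasibility of a candidate route. A minimal sketch, with hypothetical travel times, service time, and windows, is:

```python
# Hypothetical sketch of the time-window check at the heart of a CVRPTW
# model: simulate a truck along a candidate route, waiting when it
# arrives early and rejecting the route when it arrives after a
# customer's window closes. All numbers are illustrative.

def route_feasible(route, travel_time, windows, service=5):
    """route: customer ids in visit order; windows: {id: (open, close)}."""
    t = 0                      # depart the depot at time 0
    prev = "depot"
    for c in route:
        t += travel_time[(prev, c)]
        open_, close = windows[c]
        t = max(t, open_)      # wait if early
        if t > close:
            return False       # window missed -> route infeasible
        t += service
        prev = c
    return True

travel_time = {("depot", "A"): 10, ("A", "B"): 15, ("depot", "B"): 20}
windows = {"A": (0, 30), "B": (20, 40)}
print(route_feasible(["A", "B"], travel_time, windows))  # True
```

The paper's extension with intermediate exchange locations would add such locations as extra stops whose "windows" synchronise two trucks, but the per-route feasibility logic stays the same.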

Relevance: 30.00%

Abstract:

Purpose - The purpose of the paper is to develop an integrated quality management model that identifies problems, suggests solutions, develops a framework for implementation and helps evaluate the performance of health care services dynamically. Design/methodology/approach - This paper uses logical framework analysis (LFA), a matrix approach to project planning for managing quality. This has been applied to three acute healthcare services (operating room utilization, accident and emergency, and intensive care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This paper shows the application of LFA to three service processes in one hospital; ideally, it should also be tested in several hospitals and across other services. Practical implications - The proposed model can be applied in hospital-based healthcare services to improve performance. Originality/value - The paper shows that quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed to identify quality issues in health care delivery, and corrective measures are taken for superior performance, there has been no integrated approach that can identify and analyze issues, provide solutions to resolve those issues, and develop a project management framework (planning, monitoring, and evaluating) to implement those solutions in order to improve process performance. This study introduces an integrated and uniform quality management tool that integrates operations with organizational strategies. © Emerald Group Publishing Limited.

Relevance: 30.00%

Abstract:

This paper focuses on minimizing printed circuit board (PCB) assembly time for a chip shooter machine, which has a movable feeder carrier holding components, a movable X–Y table carrying a PCB, and a rotary turret with multiple assembly heads. The assembly time of the machine depends on two inter-related optimization problems: the component sequencing problem and the feeder arrangement problem. Nevertheless, they have often been regarded as two individual problems and solved separately. This paper proposes two complete mathematical models for the integrated problem of the machine. The models are verified by two commercial packages. Finally, a hybrid genetic algorithm previously developed by the authors is presented to solve the model. The algorithm not only generates optimal solutions quickly for small-sized problems, but also outperforms the genetic algorithms developed by other researchers in terms of total assembly time.
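For a tiny instance, the component sequencing side of the problem can be illustrated by brute force, using X–Y table travel distance as a stand-in for assembly time. The coordinates below are hypothetical; realistic instances, and the coupled feeder arrangement problem, require something like the paper's hybrid genetic algorithm.

```python
from itertools import permutations

# Toy sketch of the component sequencing subproblem: enumerate placement
# orders for a tiny instance and pick the one minimising total X-Y table
# travel, used here as a proxy for assembly time. Coordinates are
# hypothetical, not from the paper.
placements = {"C1": (0, 0), "C2": (4, 0), "C3": (4, 3), "C4": (0, 3)}

def travel(order):
    """Euclidean travel distance along the placement order."""
    pts = [placements[c] for c in order]
    return sum(((x2 - x1)**2 + (y2 - y1)**2) ** 0.5
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

best = min(permutations(placements), key=travel)
print(best, travel(best))  # shortest placement order and its distance
```

Exhaustive enumeration grows factorially, which is exactly why metaheuristics such as the hybrid GA are used for the integrated sequencing-and-feeder problem.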