938 results for Product-specific model
Abstract:
The prostate-specific antigen–α1-antichymotrypsin (PSA/ACT) complex was detected by a double-enhancement strategy exploiting both colloidal gold nanoparticles (AuNPs) and the precipitation of an insoluble product formed by HRP-biocatalyzed oxidation. The AuNPs were synthesized and conjugated with horseradish peroxidase–PSA polyclonal antibody by physisorption. Using this protein–colloid conjugate for SPR-based detection of the PSA/ACT complex gave an enhancement consistent with previous studies of AuNP enhancement, while enzyme precipitation using the DAB substrate was applied for the first time and greatly amplified the signal. The limit of detection was as low as 0.027 ng/ml of the PSA/ACT complex (300 fM), much lower than in previous reports. This study indicates another way to enhance SPR measurements, one that is generally applicable to other SPR-based immunoassays.
Abstract:
In this article the multibody simulation software package MADYMO for analysing and optimizing occupant safety design was used to model crash tests for Normal Containment barriers in accordance with EN 1317. The verification process was carried out by simulating a TB31 and a TB32 crash test performed on vertical portable concrete barriers and by comparing the numerical results to those obtained experimentally. The same modelling approach was applied to both tests to evaluate the predictive capacity of the modelling at two different impact speeds. A sensitivity analysis of the vehicle stiffness was also carried out. The capacity to predict all of the principal EN 1317 criteria was assessed for the first time: the acceleration severity index, the theoretical head impact velocity, the barrier working width and the vehicle exit box. Results showed a maximum error of 6% for the acceleration severity index and 21% for the theoretical head impact velocity for the numerical simulation in comparison to the recorded data. The exit box position was predicted with a maximum error of 4°. For the working width, a large percentage difference was observed for test TB31 due to the small absolute value of the barrier deflection, but the results were well within the limit value from the standard for both tests. The sensitivity analysis showed the robustness of the modelling with respect to contact stiffness variations of ±20% and ±40%. This is the first multibody model of portable concrete barriers that can reproduce not only the acceleration severity index but all the test criteria of EN 1317, and it is therefore a valuable tool for new product development and for injury biomechanics research.
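The acceleration severity index mentioned above has a standard definition in EN 1317: the three vehicle accelerations, averaged over a 50 ms moving window, are normalised by component limit values (12 g longitudinal, 9 g lateral, 10 g vertical) and combined. A minimal sketch, not taken from the paper (the function name, sampling step and sample signals are illustrative):

```python
import numpy as np

def asi(ax, ay, az, dt=0.001, window=0.05, limits=(12.0, 9.0, 10.0)):
    """Acceleration Severity Index in the spirit of EN 1317.

    ax, ay, az: vehicle CoG accelerations in g, sampled every dt seconds.
    Each component is averaged over a 50 ms moving window, normalised by
    its limit value, and the components are combined in quadrature; the
    ASI is the maximum of that combination over time.
    """
    n = max(1, int(round(window / dt)))
    kernel = np.ones(n) / n
    avg = [np.convolve(a, kernel, mode="valid") for a in (ax, ay, az)]
    combined = np.sqrt(sum((a / lim) ** 2 for a, lim in zip(avg, limits)))
    return combined.max()
```

A constant 6 g longitudinal acceleration, for instance, yields ASI = 6/12 = 0.5.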
Abstract:
Research on business model development has focused on the relationships between elements of value conceptualization and organization, assuming a linear sequence in which business models are first designed and then implemented. Another stream of research describes business model development as these elements interacting in a cyclical manner. There is a need to improve our understanding of the connective mechanisms and dynamics involved in business model development, particularly from the challenging perspective of commercializing innovations. The aim of this paper was to explore business model development during the commercialization of innovations through a case-based qualitative study. Across four case studies, it was found that specific elements of business model development, representative of the conceptualization of value and of organizing for value creation, integrate in a dynamic and cyclical process during the commercialization of technology innovations. The study provides empirical evidence that adds new insights to the literature on sequential and more interactive processes of business model development. It also contributes to the literature on business model development and, in particular, to how it relates to the commercialization of innovations.
Abstract:
Clashes occur when components in an assembly unintentionally violate others. If clashes are not identified and designed out before manufacture, product function will be reduced or substantial cost will be incurred in rework. This paper introduces a novel approach for eliminating clashes by identifying which parameters defining the part features in a computer aided design (CAD) assembly need to change and by how much. Sensitivities are calculated for each parameter defining the part and the assembly as the change in clash volume due to a change in each parameter value. These sensitivities give an indication of important parameters and are used to predict the optimum combination of changes in each parameter to eliminate the clash. Consideration is given to the fact that it is sometimes preferable to modify some components in an assembly rather than others and that some components in an assembly cannot be modified as the designer does not have control over their shape. Successful elimination of clashes has been demonstrated in a number of example assemblies.
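The sensitivity-driven approach described above can be sketched as follows: finite-difference sensitivities of the clash volume with respect to each parameter, followed by a minimum-norm parameter update, with "frozen" parameters standing in for components the designer cannot modify. This is an illustrative reconstruction, not the paper's implementation; all names and the toy 1D overlap function are assumptions:

```python
import numpy as np

def sensitivities(clash_volume, params, eps=1e-6):
    """Finite-difference sensitivity of clash volume to each parameter."""
    base = clash_volume(params)
    grads = np.zeros(len(params))
    for i in range(len(params)):
        p = params.copy()
        p[i] += eps
        grads[i] = (clash_volume(p) - base) / eps
    return base, grads

def eliminate_clash(clash_volume, params, frozen=(), max_iter=50):
    """Step the free parameters along the sensitivity direction until the
    clash volume reaches zero. 'frozen' indexes parameters that may not
    change (e.g. bought-in components outside the designer's control)."""
    params = np.asarray(params, dtype=float)
    for _ in range(max_iter):
        v, g = sensitivities(clash_volume, params)
        if v <= 0:
            break
        g[list(frozen)] = 0.0
        if not np.any(g):
            break
        # minimum-norm update predicted to remove the clash volume
        params -= v * g / np.dot(g, g)
    return params
```

With a toy 1D "assembly" (part A spans [0, a], part B spans [b, b+2]) the overlap is driven to zero in a couple of steps.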
Abstract:
Rationale: Increasing epithelial repair and regeneration may hasten resolution of lung injury in patients with the Acute Respiratory Distress Syndrome (ARDS). In animal models of ARDS, Keratinocyte Growth Factor (KGF) reduces injury and increases epithelial proliferation and repair. The effect of KGF in the human alveolus is unknown.
Objectives: To test whether KGF can attenuate alveolar injury in a human model of ARDS.
Methods: Volunteers were randomized to intravenous KGF (60 μg/kg) or placebo for 3 days before inhaling 50 μg of lipopolysaccharide (LPS). Six hours later, subjects underwent bronchoalveolar lavage (BAL) to quantify markers of alveolar inflammation and cell-specific injury.
Measurements and Main Results: KGF did not alter leukocyte infiltration or markers of permeability in response to LPS. KGF increased BAL concentrations of Surfactant Protein D (SP-D), MMP-9, IL-1Ra, GM-CSF and CRP. In vitro, BAL fluid from KGF-treated subjects (KGF BAL) inhibited pulmonary fibroblast proliferation, but increased alveolar epithelial proliferation. Active MMP-9 increased alveolar epithelial wound repair. Finally, BAL from the KGF pre-treated group enhanced macrophage phagocytic uptake of apoptotic epithelial cells and bacteria compared with BAL from the placebo-treated group. This effect was blocked by inhibiting activation of the GM-CSF receptor.
Conclusions: KGF treatment increases BAL SP-D, a marker of type II alveolar epithelial cell proliferation, in a human model of acute lung injury. Additionally, KGF increases alveolar concentrations of the anti-inflammatory cytokine IL-1Ra and of mediators that drive epithelial repair (MMP-9) and enhance macrophage clearance of dead cells and bacteria (GM-CSF).
Abstract:
Background: A previously described economic model was based on average values for patients diagnosed with chronic periodontitis (CP). However, tooth loss varies among treated patients and factors for tooth loss include CP severity and risk. The model was refined to incorporate CP severity and risk to determine the cost of treating a specific level of CP severity and risk that is associated with the benefit of tooth preservation.
Methods: A population that received periodontal treatment and another that did not were used to determine treatment costs and tooth loss. The number of teeth preserved was the difference in the number of teeth lost between the two populations. The cost of periodontal treatment was divided by the number of teeth preserved for combinations of CP severity and risk.
Results: The cost of periodontal treatment divided by the number of teeth preserved ranged from (US) $1,405 to $4,895 for high or moderate risk combined with any severity of CP, and was more than $8,639 for low risk combined with mild CP. The cost of a three-unit bridge was $3,416, and the cost of a single-tooth replacement was $4,787.
Conclusion: Periodontal treatment could be justified on the sole basis of tooth preservation when CP risk is moderate or high regardless of disease severity.
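The arithmetic behind the model is simple: treatment cost divided by the number of teeth preserved, where the latter is the difference in tooth loss between the untreated and treated populations. A minimal sketch; the input figures below are invented for illustration only, not taken from the study:

```python
def cost_per_tooth_preserved(treatment_cost, teeth_lost_untreated,
                             teeth_lost_treated):
    """Cost of periodontal treatment per tooth preserved: treatment cost
    divided by the per-patient difference in tooth loss between an
    untreated and a treated population."""
    preserved = teeth_lost_untreated - teeth_lost_treated
    if preserved <= 0:
        raise ValueError("treatment preserved no teeth")
    return treatment_cost / preserved

# hypothetical example: $2,810 of treatment, 2.5 vs 0.5 teeth lost
# per patient -> $1,405 per tooth preserved
```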
Abstract:
An intralaminar damage model (IDM), based on continuum damage mechanics, was developed for the simulation of composite structures subjected to damaging loads. This model can capture the complex intralaminar damage mechanisms, accounting for mode interactions, as well as delamination. Its development is driven by a requirement for reliable crush simulations to design composite structures with a high specific energy absorption. This IDM was implemented as a user subroutine within the commercial finite element package Abaqus/Explicit [1]. In this paper, the validation of the IDM is presented using two test cases. Firstly, the IDM is benchmarked against published data for a blunt-notched specimen under uniaxial tensile loading, comparing the failure strength as well as showing the damage. Secondly, the crush response of a set of tulip-triggered composite cylinders was obtained experimentally. The crush loading and the associated energy absorption of the specimens are compared with the FE model predictions. These test cases show that the developed IDM is able to capture the structural response with satisfactory accuracy.
Abstract:
The present research investigates the uptake of phosphate ions from aqueous solutions using acidified laterite (ALS), a by-product from the production of ferric aluminium sulfate using laterite. Phosphate adsorption experiments were performed in batch systems to determine the amount of phosphate adsorbed as a function of solution pH, adsorbent dosage and thermodynamic parameters at a fixed P concentration. Kinetic studies were also carried out to study the effect of adsorbent particle size. The maximum removal capacity of ALS, observed at pH 5, was 3.68 mg P g-1. It was found that the equilibrium pH decreases as the adsorbent dosage increases, so an adsorbent dosage of 1.0 g L-1 of ALS was selected. The adsorption capacity (qm) calculated from the Langmuir isotherm was 2.73 mg g-1. Kinetic experimental data were well described by the pseudo-first-order model over the full range of adsorbent particle sizes. The adsorption reactions were endothermic, and the process was favoured at high temperature; the ΔG and ΔH values implied that the main mechanism of P adsorption onto ALS is physisorption. The desorption studies indicated that a 0.1 M NaOH solution should be considered as an optimal solution for practical regeneration applications.
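The isotherm and kinetic models referenced above have standard closed forms. A small sketch; the function names and the use of the linearised Langmuir form for fitting are illustrative choices, not the paper's procedure:

```python
import numpy as np

def langmuir(Ce, qm, KL):
    """Langmuir isotherm: qe = qm * KL * Ce / (1 + KL * Ce),
    with Ce the equilibrium concentration and qm the capacity."""
    return qm * KL * Ce / (1.0 + KL * Ce)

def pseudo_first_order(t, qe, k1):
    """Pseudo-first-order kinetics: qt = qe * (1 - exp(-k1 * t))."""
    return qe * (1.0 - np.exp(-k1 * t))

def fit_langmuir(Ce, qe):
    """Estimate qm and KL from the linearised Langmuir form
    Ce/qe = Ce/qm + 1/(qm * KL) via a straight-line fit."""
    slope, intercept = np.polyfit(Ce, Ce / qe, 1)
    qm = 1.0 / slope
    KL = slope / intercept
    return qm, KL
```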
Abstract:
Composite Applications on top of SAP's implementation of SOA (Enterprise SOA) enable the extension of already existing business logic. In this paper we show, based on a case study, how Model-Driven Engineering concepts are applied in the development of such Composite Applications. Our case study extends a back-end business process to meet the specific needs of a demo company selling wine. We use this to describe how the business-centric models specifying the modified business behaviour of our case study can be utilized for business performance analysis in a setting where most of the actions are performed by humans. In particular, we apply a refined version of Model-Driven Performance Engineering that we proposed in our previous work and motivate which business domain specifics have to be taken into account for business performance analysis. We additionally motivate the need for performance-related decision support for domain experts, who generally lack performance-related skills. Such support should offer visual guidance about what should be changed in the design and in the resource mapping to obtain improved results with respect to modification constraints and performance objectives, such as time-related objectives.
Abstract:
This paper contributes a new approach for developing UML software designs from Natural Language (NL), making use of a meta-domain oriented ontology, well established software design principles and Natural Language Processing (NLP) tools. In the approach described here, banks of grammatical rules are used to assign event flows from essential use cases. A domain specific ontology is also constructed, permitting semantic mapping between the NL input and the modeled domain. Rules based on the widely-used General Responsibility Assignment Software Principles (GRASP) are then applied to derive behavioral models.
Abstract:
In this paper we investigate the first and second order characteristics of the received signal at the output of hypothetical selection, equal gain and maximal ratio combiners which utilize spatially separated antennas at the base station. Considering a range of human body movements, we model the small-scale fading characteristics of the signal using diversity-specific analytical equations which take into account the number of available signal branches at the receiver. It is shown that these equations provide an excellent fit to the measured channel data. Furthermore, for many hypothetical diversity receiver configurations, the Nakagami-m parameter was found to be close to 1.
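The Nakagami-m parameter reported above can be estimated from measured envelope samples by the standard inverse-normalised-variance (moment) method; m = 1 recovers Rayleigh fading, which is why values close to 1 indicate near-Rayleigh conditions. A minimal sketch, not the authors' estimator (the function name is an assumption):

```python
import numpy as np

def nakagami_m(r):
    """Moment-based estimate of the Nakagami-m fading parameter from an
    envelope sample r: m = (E[r^2])^2 / Var(r^2). For a Rayleigh-faded
    envelope the instantaneous power r^2 is exponential and m = 1."""
    p = np.asarray(r) ** 2
    return np.mean(p) ** 2 / np.var(p)
```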
Abstract:
Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
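The wet- and dry-spell statistics compared above reduce to counting runs of consecutive wet or dry days in a daily precipitation series. A minimal sketch; the 0.1 mm wet-day threshold is a common convention assumed here, not taken from the paper:

```python
import numpy as np

def spell_lengths(precip, wet_threshold=0.1):
    """Lengths of consecutive wet and dry runs in a daily precipitation
    series; a day counts as 'wet' if precip >= wet_threshold (mm)."""
    wet = np.asarray(precip) >= wet_threshold
    spells = {"wet": [], "dry": []}
    run, state = 1, wet[0]
    for w in wet[1:]:
        if w == state:
            run += 1
        else:
            spells["wet" if state else "dry"].append(run)
            run, state = 1, w
    spells["wet" if state else "dry"].append(run)
    return spells
```

Cumulative spell frequencies, as compared in the paper, follow by histogramming these run lengths.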
Abstract:
Viscosity represents a key indicator of product quality in polymer extrusion but has traditionally been difficult to measure in-process in real-time. An innovative, yet simple, solution to this problem is proposed by a Prediction-Feedback observer mechanism. A 'Prediction' model based on the operating conditions generates an open-loop estimate of the melt viscosity; this estimate is used as an input to a second, 'Feedback' model to predict the pressure of the system. The pressure value is compared to the actual measured melt pressure and the error used to correct the viscosity estimate. The Prediction model captures the relationship between the operating conditions and the resulting melt viscosity and as such describes the specific material behavior. The Feedback model on the other hand describes the fundamental physical relationship between viscosity and extruder pressure and is a function of the machine geometry. The resulting system yields viscosity estimates within 1% error, shows excellent disturbance rejection properties and can be directly applied to model-based control. This is of major significance to achieving higher quality and reducing waste and set-up times in the polymer extrusion industry.
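The Prediction-Feedback structure described above can be sketched generically: an open-loop viscosity estimate from the operating conditions is repeatedly corrected by the error between the modelled and measured melt pressure. This is a toy illustration of the observer structure only, not the authors' models; the function arguments, the linear pressure model in the test and the correction gain are all assumptions:

```python
def viscosity_observer(predict_viscosity, pressure_model, measured_pressure,
                       operating_conditions, gain=0.4, steps=50):
    """Prediction-Feedback observer sketch.

    predict_viscosity: open-loop 'Prediction' model (operating conditions
    -> viscosity estimate). pressure_model: 'Feedback' model (viscosity ->
    predicted melt pressure). The pressure error drives the correction."""
    eta = predict_viscosity(operating_conditions)   # open-loop estimate
    for _ in range(steps):
        p_hat = pressure_model(eta)                 # predicted pressure
        eta += gain * (measured_pressure - p_hat)   # correct the estimate
    return eta
```

With a stable gain the estimate converges to the viscosity consistent with the measured pressure, which is the disturbance-rejection behaviour the abstract describes.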
Abstract:
Despite the popularity of the Theory of Planned Behaviour (TPB), there is a lack of research assessing the efficacy of the model in understanding the health behaviour of children, with those studies that have been conducted reporting problems with questionnaire formulation and low to moderate internal consistencies for TPB constructs. The aim of this study was to develop and test a TPB-based measure suitable for use with primary school children aged 9 to 10 years. A mixed-method sequential design was employed. In Stage 1, seven semi-structured focus group discussions (N = 56) were conducted to elicit the underlying beliefs specific to tooth brushing. Using thematic content analysis, the beliefs were identified and a TPB measure was developed. A repeated-measures design was employed in Stage 2, using test-retest reliability analysis to assess its psychometric properties. In all, 184 children completed the questionnaire. Test-retest reliabilities support the validity and reliability of the TPB measure for assessing the tooth brushing beliefs of children. Pearson's product-moment correlations were calculated for all of the TPB beliefs, achieving substantial to almost perfect agreement levels. Specifically, a significant relationship between all 10 of the direct and indirect TPB constructs was achieved at the 0.01 level. This paper discusses the design and development of the measure so that it could serve as a guide to fellow researchers and health psychologists interested in using theoretical models to investigate the health and well-being of children.
Abstract:
Diagnostic test sensitivity and specificity are probabilistic estimates with far reaching implications for disease control, management and genetic studies. In the absence of 'gold standard' tests, traditional Bayesian latent class models may be used to assess diagnostic test accuracies through the comparison of two or more tests performed on the same groups of individuals. The aim of this study was to extend such models to estimate diagnostic test parameters and true cohort-specific prevalence, using disease surveillance data. The traditional Hui-Walter latent class methodology was extended to allow for features seen in such data, including (i) unrecorded data (i.e. data for a second test available only on a subset of the sampled population) and (ii) cohort-specific sensitivities and specificities. The model was applied with and without the modelling of conditional dependence between tests. The utility of the extended model was demonstrated through application to bovine tuberculosis surveillance data from Northern Ireland and the Republic of Ireland. Simulation coupled with re-sampling techniques demonstrated that the extended model has good predictive power to estimate the diagnostic parameters and true herd-level prevalence from surveillance data. Our methodology can aid in the interpretation of disease surveillance data, and the results can potentially refine disease control strategies.
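As a simpler frequentist cousin of the latent class approach above: when a single test's sensitivity and specificity are taken as known, true prevalence follows from apparent prevalence via the Rogan-Gladen correction. This is not the paper's Bayesian Hui-Walter model, only the single-test special case it generalises:

```python
def rogan_gladen(apparent_prev, sensitivity, specificity):
    """Rogan-Gladen estimator: true prevalence from apparent (test-positive)
    prevalence given known sensitivity and specificity, clipped to [0, 1].
    Derived from: apparent = true*Se + (1 - true)*(1 - Sp)."""
    denom = sensitivity + specificity - 1.0
    if denom <= 0:
        raise ValueError("test is uninformative (Se + Sp <= 1)")
    return min(1.0, max(0.0, (apparent_prev + specificity - 1.0) / denom))
```

For example, with Se = 0.8 and Sp = 0.99, a true prevalence of 0.1 produces an apparent prevalence of 0.1 x 0.8 + 0.9 x 0.01 = 0.089, and the correction recovers 0.1.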