916 results for "Applications of nanocomposites"


Relevance: 90.00%

Abstract:

This paper reviews Bayesian procedures for phase 1 dose-escalation studies and compares different dose schedules and cohort sizes. The methodology described is motivated by the situation of phase 1 dose-escalation studies in oncology, that is, a single dose administered to each patient, with a single binary response ("toxicity" or "no toxicity") observed. It is likely that a wider range of applications of the methodology is possible. In this paper, results from 10,000-fold simulation runs conducted using the software package Bayesian ADEPT are presented. Four designs were compared under six scenarios. The simulation results indicate that there are slight advantages to having more dose levels and smaller cohort sizes.
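
As an illustration of the kind of simulation study such comparisons involve, the minimal sketch below simulates a generic one-parameter Bayesian dose-escalation design (a CRM-style power model, not the ADEPT procedure itself). The skeleton, the assumed true toxicity rates, the prior variance, and the trial sizes are all hypothetical assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical settings (not taken from the paper): five dose levels,
# target toxicity rate 0.25, one-parameter power ("CRM-style") working model.
skeleton = np.array([0.05, 0.10, 0.20, 0.30, 0.45])  # prior toxicity guesses
true_tox = np.array([0.03, 0.08, 0.15, 0.28, 0.50])  # assumed truth for simulation
target = 0.25

# Grid approximation to the posterior of the model parameter a, where
# P(toxicity | dose i) = skeleton_i ** exp(a) and a ~ N(0, 1.34) a priori.
a_grid = np.linspace(-4.0, 4.0, 801)
prior = np.exp(-0.5 * a_grid**2 / 1.34)

def run_trial(cohort_size=3, n_cohorts=10):
    doses, toxic = [], []
    level = 0                                   # start at the lowest dose
    for _ in range(n_cohorts):
        y = rng.random(cohort_size) < true_tox[level]
        doses += [level] * cohort_size
        toxic += list(y)
        # Posterior over a given all responses observed so far.
        p = skeleton[np.array(doses)][:, None] ** np.exp(a_grid)[None, :]
        lik = np.where(np.array(toxic)[:, None], p, 1.0 - p).prod(axis=0)
        post = lik * prior
        post /= post.sum()
        # Next cohort receives the dose whose posterior mean toxicity
        # is closest to the target rate.
        p_hat = (skeleton[:, None] ** np.exp(a_grid)[None, :] * post).sum(axis=1)
        level = int(np.argmin(np.abs(p_hat - target)))
    return level                                # dose recommended at the end

picks = np.array([run_trial() for _ in range(1000)])
print("selection frequencies:", np.bincount(picks, minlength=5) / 1000)
```

Repeating such a loop many thousands of times for each candidate design (number of dose levels, cohort size) under each scenario is the kind of exercise on which comparisons of operating characteristics rest.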

Relevance: 90.00%

Abstract:

Background: The present paper investigates the question of a suitable basic model for the number of scrapie cases in a holding, and applications of this knowledge to the estimation of the scrapie-affected holding population size and the adequacy of control measures within holdings. Is the number of scrapie cases proportional to the size of the holding, in which case it should be incorporated into the parameter of the error distribution for the scrapie counts? Or is there a different, potentially more complex, relationship between case count and holding size, in which case the information about the size of the holding would be better incorporated as a covariate in the modelling? Methods: We show that this question can be appropriately addressed via a simple zero-truncated Poisson model in which the hypothesis of proportionality enters as a special offset model. Model comparisons can be achieved by means of likelihood ratio testing. The procedure is illustrated by means of surveillance data on classical scrapie in Great Britain. Furthermore, the model with the best fit is used to estimate the size of the scrapie-affected holding population in Great Britain by means of two capture-recapture estimators: the Poisson estimator and the generalized Zelterman estimator. Results: No evidence could be found for the hypothesis of proportionality. In fact, there is some evidence that the relationship follows a curved line, which increases for small holdings up to a maximum and then declines again. Furthermore, it is pointed out how crucial correct model choice is for capture-recapture estimation based on zero-truncated Poisson models as well as on the generalized Zelterman estimator: estimators based on the proportionality model return very different and unreasonable estimates of the population size. Conclusion: Our results stress the importance of an adequate modelling approach to the association between holding size and the number of cases of classical scrapie within a holding. Reporting artefacts and speculative biological effects are hypothesized as the underlying causes of the observed curved relationship. The lack of adjustment for these artefacts might well render the current strategies for the control of the disease ineffective.
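
The modelling question and the two estimators can be sketched as follows. This is a minimal illustration on simulated data (the holding sizes, counts, and coefficients are invented; it is not the GB surveillance data): the zero-truncated Poisson model is fitted with log holding size entering as an offset (proportionality) and as a free covariate, the two fits are compared by a likelihood ratio test, and a Horvitz-Thompson-type Poisson estimate and the simple (non-generalized) Zelterman estimate of the affected-holding population size are computed.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

rng = np.random.default_rng(2)

# Hypothetical data: case counts per affected holding (all >= 1 by design)
# and the corresponding holding sizes.
size = rng.integers(50, 2000, size=300).astype(float)
lam_true = np.exp(-3.0 + 0.6 * np.log(size))      # deliberately non-proportional
y = rng.poisson(lam_true)
keep = y > 0                                       # zero-truncation
y, size = y[keep], size[keep]

def negloglik(beta, offset_only):
    # log lambda = b0 + log(size)        (proportionality / offset model)
    # log lambda = b0 + b1 * log(size)   (covariate model)
    slope = 1.0 if offset_only else beta[1]
    lam = np.exp(beta[0] + slope * np.log(size))
    ll = y * np.log(lam) - lam - np.log(1.0 - np.exp(-lam))   # + const
    return -ll.sum()

opts = {"maxiter": 5000, "maxfev": 5000}
fit_off = minimize(negloglik, x0=[0.0], args=(True,), method="Nelder-Mead", options=opts)
fit_cov = minimize(negloglik, x0=[0.0, 1.0], args=(False,), method="Nelder-Mead", options=opts)

lr = 2.0 * (fit_off.fun - fit_cov.fun)             # covariate model nests the offset model
print("LR test of proportionality: stat = %.2f, p = %.4f" % (lr, chi2.sf(lr, df=1)))

# Capture-recapture estimates of the total number of affected holdings.
lam_hat = np.exp(fit_cov.x[0] + fit_cov.x[1] * np.log(size))
N_ht = np.sum(1.0 / (1.0 - np.exp(-lam_hat)))      # Horvitz-Thompson / Poisson
f1, f2 = np.sum(y == 1), np.sum(y == 2)
lam_z = 2.0 * f2 / f1                              # simple Zelterman estimator
N_z = len(y) / (1.0 - np.exp(-lam_z))
print("Poisson HT estimate: %.0f   Zelterman estimate: %.0f" % (N_ht, N_z))
```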

Relevance: 90.00%

Abstract:

This article is about modeling count data with zero truncation. A parametric count density family is considered. The truncated mixture of densities from this family is different from the mixture of truncated densities from the same family. Whereas the former model is more natural to formulate and to interpret, the latter model is theoretically easier to treat. It is shown that for any mixing distribution leading to a truncated mixture, a (usually different) mixing distribution can be found so that the associated mixture of truncated densities equals the truncated mixture, and vice versa. This implies that the likelihood surfaces for both situations agree, and in this sense both models are equivalent. Zero-truncated count data models are used frequently in the capture-recapture setting to estimate population size, and it can be shown that the two Horvitz-Thompson estimators, associated with the two models, agree. In particular, it is possible to achieve strong results for mixtures of truncated Poisson densities, including reliable, global construction of the unique NPMLE (nonparametric maximum likelihood estimator) of the mixing distribution, implying a unique estimator for the population size. The benefit of these results lies in the fact that it is valid to work with the mixture of truncated count densities, which is less appealing for the practitioner but theoretically easier. Mixtures of truncated count densities form a convex linear model, for which a developed theory exists, including global maximum likelihood theory as well as algorithmic approaches. Once the problem has been solved in this class, it might readily be transformed back to the original problem by means of an explicitly given mapping. Applications of these ideas are given, particularly in the case of the truncated Poisson family.
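
The equivalence can be checked numerically. The sketch below uses a hypothetical two-component Poisson mixture (weights and rates chosen purely for illustration) and shows that one such explicit mapping, v_j = w_j (1 - exp(-lambda_j)) / (1 - f(0)) with f(0) the zero probability of the untruncated mixture, turns the truncated mixture into an identical mixture of truncated densities.

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical two-component Poisson mixture (values chosen for illustration).
w = np.array([0.7, 0.3])          # mixing weights of the untruncated mixture
lam = np.array([0.8, 3.0])        # Poisson rates
x = np.arange(1, 21)              # support after zero-truncation

# Truncated mixture: truncate the mixture density at zero.
f = (w[:, None] * poisson.pmf(x[None, :], lam[:, None])).sum(axis=0)
f0 = (w * poisson.pmf(0, lam)).sum()
trunc_mix = f / (1.0 - f0)

# Mixture of truncated densities with remapped weights.
v = w * (1.0 - poisson.pmf(0, lam)) / (1.0 - f0)
mix_trunc = (v[:, None] * poisson.pmf(x[None, :], lam[:, None])
             / (1.0 - poisson.pmf(0, lam))[:, None]).sum(axis=0)

print("remapped weights:", np.round(v, 4))
print("max abs difference between the two densities:",
      np.abs(trunc_mix - mix_trunc).max())        # ~0 up to floating point

# Horvitz-Thompson population size estimate in the homogeneous mixed-Poisson
# case, N_hat = n / (1 - f0); e.g. with n = 150 observed units:
print("N_hat:", 150 / (1.0 - f0))
```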

Relevance: 90.00%

Abstract:

The structural and reactive properties of the acetyl-protected "one-legged" manganese porphyrin [SAc]P-Mn(III)Cl on Ag(100) have been studied by NEXAFS, synchrotron XPS and STM. Spontaneous surface-mediated deprotection occurs at 300 K, accompanied by spreading of the resulting thio-tethered porphyrin across the metal surface. Loss of the axial chlorine ligand occurs at 498 K, without any demetalation of the macrocycle, leaving the Mn center in a low co-ordination state. At low coverages the macrocycle is markedly tilted toward the silver surface, as is the phenyl group that forms part of the tethering "leg". In the monolayer region a striking transition occurs whereby the molecule rolls over, preserving the tilt angle of the phenyl group, strongly increasing that of the macrocycle, decreasing the apparent height of the molecule and decreasing its footprint, thus enabling closer packing. These findings are in marked contrast with those previously reported for the corresponding, more rigidly bound, four-legged porphyrin [Turner, M.; Vaughan, O. P. H.; Kyriakou, G.; Watson, D. J.; Scherer, L. J.; Davidson, G. J. E.; Sanders, J. K. M.; Lambert, R. M. J. Am. Chem. Soc. 2009, 131, 1910], suggesting that the physicochemical properties and potential applications of these versatile systems should be strongly dependent on the mode of tethering to the surface.

Relevance: 90.00%

Abstract:

Over the last 25 years, the effects of fatty acids on the immune system have been characterized using in vitro, animal and human studies. Advances in fatty acid biochemistry and molecular techniques have recently suggested new mechanisms by which fatty acids could potentially modify immune responses, including modification of the organization of cellular lipids and interaction with nuclear receptors. Possibilities for the clinical applications of n-3 PUFA are now developing. The present review focuses on the hypothesis that the anti-inflammatory properties of n-3 PUFA in the arterial wall may contribute to the protective effects of n-3 PUFA in CVD, as suggested by epidemiological and secondary prevention studies. Studies are just beginning to show that dietary n-3 PUFA can be incorporated into plaque lipid in human subjects, where they may influence the morphology and stability of the atherosclerotic lesion.

Relevance: 90.00%

Abstract:

Consumers increasingly demand convenience foods of the highest quality in terms of natural flavor and taste, and which are free from additives and preservatives. This demand has triggered the development of a number of nonthermal approaches to food processing, of which high-pressure technology has proven to be very valuable. A number of recent publications have demonstrated novel and diverse uses of this technology. Its novel features, which include destruction of microorganisms at room temperature or lower, have made the technology commercially attractive. Enzymes and spore-forming bacteria can be inactivated by the application of pressure-thermal combinations. This review aims to identify the opportunities and challenges associated with this technology. In addition to discussing the effects of high pressure on food components, this review covers the combined effects of high-pressure processing with gamma irradiation, alternating current, ultrasound, and carbon dioxide or antimicrobial treatment. Further, the applications of this technology in various sectors (fruits and vegetables, dairy and meat processing) are dealt with extensively. The integration of high pressure with other mature processing operations such as blanching, dehydration, osmotic dehydration, rehydration, frying, freezing/thawing and solid-liquid extraction has been shown to open up new processing options. The key challenges identified include heat transfer problems and the resulting non-uniformity in processing; obtaining reliable and reproducible data for process validation; a lack of detailed knowledge about the interaction between high pressure and a number of food constituents; and packaging and statutory issues.

Relevance: 90.00%

Abstract:

The eMinerals project has established an integrated compute and data minigrid infrastructure, together with a set of collaborative tools. The infrastructure is designed to support molecular simulation scientists working together as a virtual organisation that aims to understand a number of strategic processes in environmental science. The eMinerals virtual organisation is now working towards applying this infrastructure to tackle a new generation of scientific problems. This paper describes the achievements of the eMinerals virtual organisation to date and describes ongoing applications of the virtual organisation infrastructure.

Relevance: 90.00%

Abstract:

The two major types of microwave remote sensor are the radiometer and the radar. Because of its importance and the nature of its applications, much research has been carried out on the various aspects of radar. This paper focuses instead on the radiometer from a design point of view, and a low-noise amplifier (LNA) for it is designed and implemented. The work is based on a study of radio-frequency communications engineering and an understanding of electronic and RF circuits. The paper is divided into two parts: the first part presents background on the radiometer and discusses the commonly used types of radiometer; the second part presents the design of the LNA for the radiometer.
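
Two standard relations underpin radiometer front-end design: the Friis formula for the noise figure of a cascaded receiver chain (which is why the LNA sits first), and the ideal total-power radiometer sensitivity dT = T_sys / sqrt(B * tau). The sketch below evaluates both for an entirely hypothetical receiver chain; the gain, noise-figure, bandwidth, and integration-time values are assumptions, not figures from the paper.

```python
import numpy as np

# Hypothetical receiver chain: LNA first, then a mixer and an IF amplifier.
gains_db = np.array([20.0, -7.0, 30.0])   # stage gains in dB (mixer = conversion loss)
nf_db = np.array([0.8, 7.0, 3.0])         # stage noise figures in dB

g = 10 ** (gains_db / 10)                  # convert to linear
f = 10 ** (nf_db / 10)

# Friis formula for the cascaded noise factor:
# F = F1 + (F2 - 1)/G1 + (F3 - 1)/(G1*G2) + ...
f_total = f[0] + np.sum((f[1:] - 1.0) / np.cumprod(g)[:-1])
t_rx = (f_total - 1.0) * 290.0             # receiver noise temperature, K

# Ideal total-power radiometer sensitivity: dT = T_sys / sqrt(B * tau)
t_ant = 100.0                              # assumed antenna temperature, K
bandwidth = 500e6                          # assumed predetection bandwidth, Hz
tau = 1.0                                  # assumed integration time, s
t_sys = t_ant + t_rx
dT = t_sys / np.sqrt(bandwidth * tau)

print("cascaded NF = %.2f dB, T_rx = %.1f K, dT = %.4f K"
      % (10 * np.log10(f_total), t_rx, dT))
```

Because later-stage noise is divided by the preceding gains, a low-noise, high-gain first stage dominates the system noise temperature, which is the rationale for careful LNA design in a radiometer front end.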

Relevance: 90.00%

Abstract:

Space applications demand reliable systems. Autonomic computing defines such reliable systems as self-managing systems. The work reported in this paper combines agent-based and swarm robotic approaches, leading to swarm-array computing, a novel technique to achieve autonomy for distributed parallel computing systems. Two swarm-array computing approaches, based on swarms of computational resources and on swarms of tasks, are explored. An FPGA is considered as the computing system. The feasibility of the two proposed approaches, which bind the computing system and the tasks together, is evaluated in simulation on the SeSAm multi-agent simulator.

Relevance: 90.00%

Abstract:

Knowledge elicitation is a common technique used to produce rules about the operation of a plant from the knowledge available from human expertise. Similarly, data mining is becoming a popular technique for extracting rules from the data available from the operation of a plant. In the work reported here, knowledge was required to enable the supervisory control of an aluminium hot strip mill through the determination of mill set-points. A method was developed to fuse knowledge elicitation and data mining so as to incorporate the best aspects of each technique whilst avoiding known problems. Utilisation of the knowledge was through an expert system, which determined schedules of set-points and provided information to human operators. The results show that the method proposed in this paper was effective in producing rules for the on-line control of a complex industrial process.

Relevance: 90.00%

Abstract:

We provide a system identification framework for the analysis of THz-transient data. The subspace identification algorithm for both deterministic and stochastic systems is used to model the time-domain responses of structures under broadband excitation. Structures with additional time delays can be modelled within the state-space framework using additional state variables. We compare the numerical stability of the commonly used least-squares ARX models with that of the subspace N4SID algorithm, using examples of fourth-order and eighth-order systems under pulse and chirp excitation conditions. These models correspond to structures with two and four simultaneously propagating modes, respectively. We show that chirp excitation combined with the subspace identification algorithm provides a better identification of the underlying mode dynamics than the ARX model does as the complexity of the system increases. The use of an identified state-space model for mode demixing, upon transformation to a decoupled realization form, is illustrated. Applications of state-space models and the N4SID algorithm to THz transient spectroscopy, as well as to optical systems, are highlighted.
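
As a minimal, self-contained illustration of the least-squares ARX side of such a comparison (not the authors' THz pipeline and not the N4SID algorithm), the sketch below simulates a fourth-order system with two lightly damped modes, drives it with a chirp, and recovers the denominator coefficients by ordinary least squares. The pole locations, noise level, and model orders are arbitrary assumptions.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)

# Simulated fourth-order system: two lightly damped discrete-time modes
# (pole positions chosen arbitrarily for illustration).
poles = np.array([0.90 * np.exp(1j * 0.3), 0.90 * np.exp(-1j * 0.3),
                  0.85 * np.exp(1j * 1.1), 0.85 * np.exp(-1j * 1.1)])
a_true = np.poly(poles).real                    # denominator, a[0] = 1
b_true = np.array([0.0, 1.0, 0.5, 0.2, 0.1])    # numerator with one-sample delay

# Chirp excitation sweeping up to 0.4 cycles/sample.
n = 2000
t = np.arange(n)
u = signal.chirp(t, f0=0.001, f1=0.4, t1=n)
y = signal.lfilter(b_true, a_true, u) + 0.01 * rng.standard_normal(n)

# Least-squares ARX(4, 4) fit:
# y[k] = -a1*y[k-1] - ... - a4*y[k-4] + b1*u[k-1] + ... + b4*u[k-4] + e[k]
na, nb = 4, 4
rows = []
for k in range(max(na, nb), n):
    rows.append(np.concatenate([-y[k - na:k][::-1], u[k - nb:k][::-1]]))
Phi = np.array(rows)
target = y[max(na, nb):]
theta, *_ = np.linalg.lstsq(Phi, target, rcond=None)
a_hat = np.concatenate([[1.0], theta[:na]])

print("true a:", np.round(a_true, 3))
print("ARX  a:", np.round(a_hat, 3))
```

A subspace method such as N4SID would instead estimate a state-space realization directly from input-output Hankel matrices, which is where the numerical stability comparison discussed above becomes relevant as the model order grows.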

Relevance: 90.00%

Abstract:

In 1989, the computer programming language POP-11 was 21 years old. This book looks at the reasons behind its invention and traces its rise from an experimental language to a major AI language, playing a major part in many innovative projects. There is a chapter on the inventor of the language, Robin Popplestone, and a discussion of the applications of POP-11 in a variety of areas. The efficiency of AI programming is covered, along with a comparison between POP-11 and other programming languages. The book concludes by reviewing the standardization of POP-11 into POP91.