863 results for set based design
Abstract:
The formulation of a new process-based crop model, the general large-area model (GLAM) for annual crops, is presented. The model has been designed to operate on spatial scales commensurate with those of global and regional climate models. It aims to simulate the impact of climate on crop yield. Procedures for model parameter determination and optimisation are described, and demonstrated for the prediction of groundnut (i.e. peanut; Arachis hypogaea L.) yields across India for the period 1966-1989. Optimal parameters (e.g. extinction coefficient, transpiration efficiency, rate of change of harvest index) were stable over space and time, provided the estimate of the yield technology trend was based on the full 24-year period. The model has two location-specific parameters: the planting date and the yield gap parameter. The latter varies spatially and is determined by calibration. The optimal value varies slightly when different input data are used. The model was tested using a historical data set on a 2.5° × 2.5° grid to simulate yields. Three sites are examined in detail: grid cells from Gujarat in the west, Andhra Pradesh towards the south, and Uttar Pradesh in the north. Agreement between observed and modelled yield was variable, with correlation coefficients of 0.74, 0.42 and 0, respectively. Skill was highest where the climate signal was greatest, and correlations were comparable to or greater than correlations with seasonal mean rainfall. Yields from all 35 cells were aggregated to simulate all-India yield. The correlation coefficient between observed and simulated yields was 0.76, and the root mean square error was 8.4% of the mean yield. The model can be easily extended to any annual crop for the investigation of the impacts of climate variability (or change) on crop yield over large areas. (C) 2004 Elsevier B.V. All rights reserved.
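As an aside on the skill metrics reported above, here is a minimal sketch of how the correlation coefficient and the RMSE expressed as a percentage of mean yield can be computed from any observed/simulated yield pair; the series below are made-up placeholders, not the GLAM results.

```python
import numpy as np

def yield_skill(observed, simulated):
    """Pearson correlation and RMSE as a percentage of mean observed yield."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    r = np.corrcoef(observed, simulated)[0, 1]
    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    return r, 100.0 * rmse / observed.mean()

# Illustrative (made-up) all-India yield series, kg/ha
obs = [780, 810, 650, 900, 870, 720]
sim = [760, 830, 700, 880, 850, 740]
r, rmse_pct = yield_skill(obs, sim)
print(f"r = {r:.2f}, RMSE = {rmse_pct:.1f}% of mean yield")
```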
Abstract:
There is growing concern about reducing greenhouse gas emissions all over the world. In its recent post-Copenhagen report on climate change, the U.K. set emission reduction targets of 34% by 2020 and 80% by 2050 relative to 1990 levels. In practice, Life Cycle Cost (LCC) and Life Cycle Assessment (LCA) tools have been introduced to the construction industry to help achieve these targets. However, there is a clear disconnection between costs and environmental impacts over the life cycle of a built asset when using these two tools. Besides, changes in Information and Communication Technologies (ICTs) have changed the way information is represented; in particular, information is being fed more easily and distributed more quickly to different stakeholders through tools such as Building Information Modelling (BIM), yet with little consideration given to incorporating LCC and LCA and maximising their usage within the BIM environment. The aim of this paper is to propose the development of a model-based LCC and LCA tool that supports sustainable building design decisions for clients, architects and quantity surveyors, so that an optimal investment decision can be made by studying the trade-off between costs and environmental impacts. Finally, an application framework is proposed as future work, showing how the proposed model can be incorporated into the BIM environment in practice.
Abstract:
Norms are a set of rules that govern the behaviour of human agents and how they behave under given conditions. This paper investigates the overlapping of information fields (sets of shared norms) in the Context State Transition Model, and how these overlapping fields may affect the choices and actions of human agents. The paper also discusses the implementation of new conflict resolution strategies based on the situation specification. The reasoning about conflicting norms in multiple information fields is discussed in detail.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
This paper presents an improved design methodology for determining the parameters used in the classical Series-Parallel Loaded Resonant (SPLR) filter employed in switching-frequency-controlled dimmable electronic ballasts. According to the analysis developed in this paper, it is possible to evaluate some characteristics of the resonant filter during the dimming process, such as the range of switching frequency, the phase shift, and the rms value of the current drawn by the resonant filter plus fluorescent lamp set.
Abstract:
This paper presents an improved design methodology for determining the parameters used in the classical series-resonant parallel-loaded (SRPL) filter employed in switching-frequency-controlled dimmable electronic ballasts. According to the analysis developed in this paper, it is possible to evaluate some important characteristics of the resonant filter during dimming operation, such as the range of switching frequency, the phase shift, and the rms value of the current drawn by the resonant filter plus fluorescent lamp set. Experimental results are presented in order to validate the analyses developed in this paper. © 2005 IEEE.
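To illustrate the kind of evaluation described in this and the preceding abstract, here is a minimal fundamental-frequency phasor sketch of a series-resonant parallel-loaded stage (series Ls and Cs, with Cp in parallel with a lamp modelled as a pure resistance). The topology simplification, the component values and the source voltage are assumptions for illustration, not the paper's design values.

```python
import numpy as np

def srpl_response(f, V_rms, Ls, Cs, Cp, R_lamp):
    """Fundamental-frequency phasor analysis of an SRPL (LCC) stage.

    Returns input current (rms), lamp current (rms) and the phase shift
    between source voltage and input current at frequency f (Hz).
    """
    w = 2 * np.pi * f
    Zp = R_lamp / (1 + 1j * w * R_lamp * Cp)        # Cp in parallel with the lamp
    Zin = 1j * w * Ls + 1 / (1j * w * Cs) + Zp      # series Ls, Cs, then the parallel block
    I_in = V_rms / Zin                              # input (filter + lamp set) current phasor
    I_lamp = I_in * Zp / R_lamp                     # current through the lamp resistance
    phase_deg = np.degrees(np.angle(Zin))           # positive -> current lags (inductive operation)
    return abs(I_in), abs(I_lamp), phase_deg

# Assumed values for illustration only
for f in (40e3, 50e3, 60e3):
    i_in, i_lamp, phase = srpl_response(f, V_rms=155, Ls=1.2e-3,
                                        Cs=220e-9, Cp=6.8e-9, R_lamp=350)
    print(f"{f/1e3:.0f} kHz: I_in={i_in:.3f} A, I_lamp={i_lamp:.3f} A, phase={phase:.1f} deg")
```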
Abstract:
Peroxisome-proliferator-activated receptors (PPARs) are a class of nuclear receptors with three subtypes: α, γ and δ. Their main function is regulating gene transcription related to lipid and carbohydrate metabolism. Currently, there are no PPARδ drugs on the market. In this work, we studied a data set of 70 compounds with α and δ activity. Three partial least squares models were created, and molecular docking studies were performed to understand the main determinants of PPARδ selectivity. The results showed that some molecular descriptors (log P, hydration energy, steric and polar properties) are related to the main interactions that can direct ligands to a particular PPAR subtype.
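A minimal sketch of fitting a partial least squares model to such descriptors follows, assuming a placeholder descriptor table and synthetic activities rather than the paper's 70-compound data set.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Placeholder descriptors: log P, hydration energy, steric and polar terms
rng = np.random.default_rng(0)
X = rng.normal(size=(70, 4))
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(scale=0.3, size=70)  # synthetic activity values

pls = PLSRegression(n_components=2).fit(X, y)
print("fitted r2:", round(pls.score(X, y), 2))
print("descriptor coefficients:", pls.coef_.ravel().round(2))
```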
Abstract:
Human African trypanosomiasis, also known as sleeping sickness, is a major cause of death in Africa for which no safe and effective treatments are available. The enzyme aldolase from Trypanosoma brucei is an attractive, validated target for drug development. A series of alkyl-glycolamido and alkyl-monoglycolate derivatives was studied employing a combination of drug design approaches. Three-dimensional quantitative structure-activity relationship (3D QSAR) models were generated using comparative molecular field analysis (CoMFA). Significant results were obtained for the best QSAR model (r2 = 0.95, non-cross-validated correlation coefficient, and q2 = 0.80, cross-validated correlation coefficient), indicating its predictive ability for untested compounds. The model was then used to predict values of the dependent variable (pKi) for an external test set; the predicted values were in good agreement with the experimental results. The integration of 3D QSAR, molecular docking and molecular dynamics simulations provided further insight into the structural basis for selective inhibition of the target enzyme.
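The reported statistics follow the standard definitions of the non-cross-validated r² and the leave-one-out cross-validated q²; the sketch below computes both, with a plain linear model on placeholder descriptors standing in for the CoMFA-derived PLS model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

def q2_loo(model, X, y):
    """Leave-one-out cross-validated q^2 = 1 - PRESS / SS."""
    press = 0.0
    for train, test in LeaveOneOut().split(X):
        pred = model.fit(X[train], y[train]).predict(X[test])
        press += ((y[test] - pred) ** 2).sum()
    ss = ((y - y.mean()) ** 2).sum()
    return 1.0 - press / ss

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 5))                                     # placeholder descriptors
y = X @ np.array([1.0, -0.5, 0.3, 0.0, 0.0]) + rng.normal(scale=0.2, size=30)  # placeholder pKi

r2 = LinearRegression().fit(X, y).score(X, y)                    # non-cross-validated r^2
print(f"r2 = {r2:.2f}, q2 = {q2_loo(LinearRegression(), X, y):.2f}")
```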
Abstract:
Percutaneous needle intervention based on PET/CT images is effective, but exposes the patient to unnecessary radiation due to the increased number of CT scans required. Computer assisted intervention can reduce the number of scans, but requires handling, matching and visualization of two different datasets. While one dataset is used for target definition according to metabolism, the other is used for instrument guidance according to anatomical structures. No navigation systems capable of handling such data and performing PET/CT image-based procedures while following clinically approved protocols for oncologic percutaneous interventions are available. The need for such systems is emphasized in scenarios where the target can be located in different types of tissue such as bone and soft tissue. These two tissues require different clinical protocols for puncturing and may therefore give rise to different problems during the navigated intervention. Studies comparing the performance of navigated needle interventions targeting lesions located in these two types of tissue are not often found in the literature. Hence, this paper presents an optical navigation system for percutaneous needle interventions based on PET/CT images. The system provides viewers for guiding the physician to the target with real-time visualization of PET/CT datasets, and is able to handle targets located in both bone and soft tissue. The navigation system and the required clinical workflow were designed taking into consideration clinical protocols and requirements, and the system is thus operable by a single person, even during transition to the sterile phase. Both the system and the workflow were evaluated in an initial set of experiments simulating 41 lesions (23 located in bone tissue and 18 in soft tissue) in swine cadavers. We also measured and decomposed the overall system error into distinct error sources, which allowed for the identification of particularities involved in the process as well as highlighting the differences between bone and soft tissue punctures. An overall average error of 4.23 mm and 3.07 mm for bone and soft tissue punctures, respectively, demonstrated the feasibility of using this system for such interventions. The proposed system workflow was shown to be effective in separating the preparation from the sterile phase, as well as in keeping the system manageable by a single operator. Among the distinct sources of error, the user error based on the system accuracy (defined as the distance from the planned target to the actual needle tip) appeared to be the most significant. Bone punctures showed higher user error, whereas soft tissue punctures showed higher tissue deformation error.
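The user-error metric described above (the distance from the planned target to the actual needle tip) is a straightforward Euclidean distance once both points are expressed in the same coordinate frame; a minimal sketch with made-up puncture coordinates:

```python
import numpy as np

def placement_error(planned_target, needle_tip):
    """Euclidean distance (mm) between planned target and actual needle tip."""
    return float(np.linalg.norm(np.asarray(planned_target) - np.asarray(needle_tip)))

# Illustrative punctures grouped by tissue type (positions in mm, same frame)
punctures = {
    "bone": [((10.0, 22.5, 40.0), (12.1, 24.0, 42.8)),
             ((55.0, 10.0, 30.0), (57.5, 12.0, 33.1))],
    "soft": [((80.0, 45.0, 20.0), (81.5, 46.0, 22.0)),
             ((30.0, 60.0, 15.0), (31.0, 61.5, 17.2))],
}
for tissue, pairs in punctures.items():
    errors = [placement_error(target, tip) for target, tip in pairs]
    print(f"{tissue}: mean error = {np.mean(errors):.2f} mm")
```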
Abstract:
Space Based Solar Power satellites use solar arrays to generate clean, green, and renewable electricity in space and transmit it to Earth via microwave, radio wave or laser beams to corresponding receivers (ground stations). Traditionally, these are large structures orbiting the Earth at geosynchronous altitude. This thesis introduces a new architecture for a Space Based Solar Power satellite constellation. The proposed concept reduces the high cost involved in the construction of the space satellite and in the multiple launches to geosynchronous altitude. The proposed concept is a constellation of Low Earth Orbit satellites that are smaller in size than the conventional system. For this application a Repeated Sun-Synchronous Track Circular Orbit (RSSTO) is considered. In these orbits, the spacecraft revisits the same locations on Earth periodically, every given number of days, with the line of nodes of the spacecraft's orbit fixed relative to the Sun. A wide range of solutions is studied, and, in this thesis, a two-orbit constellation design is chosen and simulated. The number of satellites is chosen based on the electric power demands in a given set of global cities. The orbits of the satellites are designed such that their ground tracks visit a maximum number of ground stations during the revisit period. In the simulation, the locations of the ground stations are chosen close to big cities, in the USA and worldwide, so that the space power constellation beams down power directly to locations of high electric power demand. The J2 perturbations are included in the mathematical model used in orbit design. The coverage time of each spacecraft over a ground site and the gap time between two consecutive spacecraft visiting a ground site are simulated in order to evaluate the coverage continuity of the proposed solar power constellation. It has been observed from simulations that there are always periods in which a spacecraft does not communicate with any ground station. For this reason, it is suggested that each satellite in the constellation be equipped with power storage components so that it can store power for later transmission. This thesis presents a method for designing the solar power constellation orbits such that the number of ground stations visited during the given revisit period is maximized. This leads to maximizing the power transmission to ground stations.
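The J2 secular nodal regression underlying repeated sun-synchronous orbit design is standard; the sketch below evaluates it for an assumed circular LEO and compares it with the sun-synchronous rate (the altitude and inclination are illustrative, not the thesis's chosen orbits).

```python
import numpy as np

MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
RE = 6378137.0           # Earth equatorial radius, m
J2 = 1.08262668e-3       # Earth's J2 zonal harmonic
SUN_SYNC_RATE = np.radians(360.0 / 365.2422) / 86400.0   # rad/s, about 0.9856 deg/day

def nodal_regression(a, e, i):
    """Secular drift of the right ascension of the ascending node due to J2 (rad/s)."""
    n = np.sqrt(MU / a**3)          # mean motion
    p = a * (1 - e**2)              # semi-latus rectum
    return -1.5 * J2 * n * (RE / p) ** 2 * np.cos(i)

# Illustrative circular LEO: 700 km altitude, 98.2 deg inclination
a = RE + 700e3
raan_dot = nodal_regression(a, e=0.0, i=np.radians(98.2))
print(f"node rate : {np.degrees(raan_dot) * 86400:.4f} deg/day")
print(f"sun-sync  : {np.degrees(SUN_SYNC_RATE) * 86400:.4f} deg/day")
```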
Abstract:
The assessment of the performance of sailing yachts, and ships in general, has been an objective for naval architects and sailors since the beginning of the history of navigation. The knowledge has grown from identifying the key factors that influence performance (length, stability, displacement and sail area), to a much more complete understanding of the complex forces and couplings involved in the equilibrium. Along with this knowledge, the advent of computers has made it possible to perform the associated tasks in a systematic way. This includes the detailed calculation of forces, but also the use of those forces, along with the description of a sailing yacht, to predict its behavior, and ultimately, its performance. The aim of this investigation is to provide a global and open definition of a set of models and rules to describe and analyze the behavior of a sailing yacht. This is done without applying any restriction to the type of yacht or calculation, but rather in a generalized way, capable of solving any possible situation, whether it is in a steady state or in the time domain. First, the basic definition of the factors that condition the behavior of a sailing yacht is given.
Then, a methodology is provided to assist with the use of data from different origins for the calculation of forces, always aiming towards the solution of the problem. This last part is implemented as a computational tool, PASim, intended to assess the performance of different types of sailing yachts in a wide range of conditions. Several examples then present different uses of PASim, as a way to illustrate some of the aspects discussed throughout the definition of the problem and its solution. Finally, a global structure is presented to provide a general virtual representation of the real yacht, in which not only the behavior, but also its handling is close to the experience of the sailors in the real world. This global structure is proposed as the core (a software engine) of a physical yacht simulator, for which a basic specification is provided.
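PASim itself is not specified here, but the core idea of a steady-state equilibrium between aerodynamic drive and hydrodynamic resistance can be sketched as a one-degree-of-freedom balance; the force models and coefficients below are deliberately crude placeholders, not PASim's models.

```python
import numpy as np
from scipy.optimize import brentq

RHO_AIR, RHO_WATER = 1.225, 1025.0   # kg/m^3

def drive_force(v_boat, v_true_wind, sail_area, cl_eff=0.4):
    """Crude aerodynamic driving force from a toy apparent-wind model."""
    v_apparent = np.hypot(v_true_wind, v_boat)
    return 0.5 * RHO_AIR * sail_area * cl_eff * v_apparent**2

def resistance(v_boat, wetted_area, cd=0.004, k_wave=180.0):
    """Crude hydrodynamic resistance: frictional term plus a wave-like term."""
    return 0.5 * RHO_WATER * wetted_area * cd * v_boat**2 + k_wave * v_boat**3 / 100.0

def equilibrium_speed(v_true_wind, sail_area=60.0, wetted_area=25.0):
    """Boat speed at which drive force balances resistance (steady state)."""
    balance = lambda v: drive_force(v, v_true_wind, sail_area) - resistance(v, wetted_area)
    return brentq(balance, 0.01, 30.0)   # root of the force balance

print(f"equilibrium speed at 8 m/s true wind: {equilibrium_speed(8.0):.2f} m/s")
```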
Abstract:
A technique for systematic peptide variation by a combination of rational and evolutionary approaches is presented. The design scheme consists of five consecutive steps: (i) identification of a “seed peptide” with a desired activity, (ii) generation of variants selected from a physicochemical space around the seed peptide, (iii) synthesis and testing of this biased library, (iv) modeling of a quantitative sequence-activity relationship by an artificial neural network, and (v) de novo design by a computer-based evolutionary search in sequence space using the trained neural network as the fitness function. This strategy was successfully applied to the identification of novel peptides that fully prevent the positive chronotropic effect of anti-β1-adrenoreceptor autoantibodies from the serum of patients with dilated cardiomyopathy. The seed peptide, comprising 10 residues, was derived by epitope mapping from an extracellular loop of human β1-adrenoreceptor. A set of 90 peptides was synthesized and tested to provide training data for neural network development. De novo design revealed peptides with desired activities that do not match the seed peptide sequence. These results demonstrate that computer-based evolutionary searches can generate novel peptides with substantial biological activity.
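The evolutionary step of this strategy, mutating sequences and ranking them with a trained surrogate as the fitness function, can be sketched as follows; the toy fitness function and the seed sequence are placeholders for the paper's trained neural network and epitope-derived seed peptide.

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def mutate(seq, n_mutations=1):
    """Return a copy of seq with n random single-residue substitutions."""
    seq = list(seq)
    for pos in random.sample(range(len(seq)), n_mutations):
        seq[pos] = random.choice(AMINO_ACIDS)
    return "".join(seq)

def evolve(seed, fitness, generations=50, offspring=20):
    """Simple elitist evolutionary hill-climb in sequence space."""
    parent, parent_fit = seed, fitness(seed)
    for _ in range(generations):
        best = max((mutate(parent) for _ in range(offspring)), key=fitness)
        if fitness(best) >= parent_fit:
            parent, parent_fit = best, fitness(best)
    return parent, parent_fit

# Stand-in fitness: reward hydrophilic residues (replace with the trained network's prediction)
def toy_fitness(seq):
    return sum(seq.count(a) for a in "DEKRNQ") / len(seq)

random.seed(42)
best_seq, score = evolve("ACDEFGHIKL", toy_fitness)   # arbitrary 10-residue placeholder seed
print(best_seq, round(score, 2))
```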
Abstract:
Background: Reliable information on causes of death is a fundamental component of health development strategies, yet globally only about one-third of countries have access to such information. For countries currently without adequate mortality reporting systems there are useful models other than resource-intensive population-wide medical certification. Sample-based mortality surveillance is one such approach. This paper provides methods for addressing appropriate sample size considerations in relation to mortality surveillance, with particular reference to situations in which prior information on mortality is lacking. Methods: The feasibility of model-based approaches for predicting the expected mortality structure and cause composition is demonstrated for populations in which only limited empirical data is available. An algorithm approach is then provided to derive the minimum person-years of observation needed to generate robust estimates for the rarest cause of interest in three hypothetical populations, each representing different levels of health development. Results: Modelled life expectancies at birth and cause of death structures were within expected ranges based on published estimates for countries at comparable levels of health development. Total person-years of observation required in each population could be more than halved by limiting the set of age, sex, and cause groups regarded as 'of interest'. Discussion: The methods proposed are consistent with the philosophy of establishing priorities across broad clusters of causes for which the public health response implications are similar. The examples provided illustrate the options available when considering the design of mortality surveillance for population health monitoring purposes.
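The paper's own algorithm is not reproduced here, but a common Poisson-based rule of thumb illustrates the type of calculation involved: the person-years needed so that the rarest cause of interest yields enough expected deaths to meet a target relative standard error. The mortality rate and precision target below are assumptions.

```python
def person_years_needed(rate_per_1000, target_rse):
    """Person-years of observation so the expected death count for the rarest
    cause of interest meets the precision target, assuming a Poisson count
    (relative standard error = 1 / sqrt(expected deaths))."""
    required_deaths = 1.0 / target_rse ** 2
    return required_deaths / (rate_per_1000 / 1000.0)

# Assumed rarest-cause mortality rate of 0.05 per 1000 person-years, 20% RSE target
print(f"{person_years_needed(0.05, 0.20):,.0f} person-years")
```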
Abstract:
The present scarcity of operational knowledge-based systems (KBS) has been attributed, in part, to an inadequate consideration shown to user interface design during development. From a human factors perspective the problem has stemmed from an overall lack of user-centred design principles. Consequently the integration of human factors principles and techniques is seen as a necessary and important precursor to ensuring the implementation of KBS which are useful to, and usable by, the end-users for whom they are intended. Focussing upon KBS work taking place within commercial and industrial environments, this research set out to assess both the extent to which human factors support was presently being utilised within development, and the future path for human factors integration. The assessment consisted of interviews conducted with a number of commercial and industrial organisations involved in KBS development; and a set of three detailed case studies of individual KBS projects. Two of the studies were carried out within a collaborative Alvey project, involving the Interdisciplinary Higher Degrees Scheme (IHD) at the University of Aston in Birmingham, BIS Applied Systems Ltd (BIS), and the British Steel Corporation. This project, which had provided the initial basis and funding for the research, was concerned with the application of KBS to the design of commercial data processing (DP) systems. The third study stemmed from involvement on a KBS project being carried out by the Technology Division of the Trustees Saving Bank Group plc. The preliminary research highlighted poor human factors integration. In particular, there was a lack of early consideration of end-user requirements definition and user-centred evaluation. Instead concentration was given to the construction of the knowledge base and prototype evaluation with the expert(s). In response to this identified problem, a set of methods was developed that was aimed at encouraging developers to consider user interface requirements early on in a project. These methods were then applied in the two further projects, and their uptake within the overall development process was monitored. Experience from the two studies demonstrated that early consideration of user interface requirements was both feasible, and instructive for guiding future development work. In particular, it was shown a user interface prototype could be used as a basis for capturing requirements at the functional (task) level, and at the interface dialogue level. Extrapolating from this experience, a KBS life-cycle model is proposed which incorporates user interface design (and within that, user evaluation) as a largely parallel, rather than subsequent, activity to knowledge base construction. Further to this, there is a discussion of several key elements which can be seen as inhibiting the integration of human factors within KBS development. These elements stem from characteristics of present KBS development practice; from constraints within the commercial and industrial development environments; and from the state of existing human factors support.
Abstract:
2016 was a breakout year for the virtual reality industry, a field in which 3D surveying plays an important role and has received increasing attention. This project aims to establish and optimize a WebGL three-dimensional broadcast platform combined with streaming media technology, taking a streaming media server and panoramic video playback in the browser as the application background. It also discusses the architecture from the streaming media server to the panoramic media player and analyses the relevant theoretical problems. The work focuses on the debugging of the streaming media platform, the structure of the WebGL player environment, the analysis of different types of sphere models, and the 3D mapping technology. The main work contains the following points. First, a streaming service platform was built on the Easy Darwin open-source streaming media server; it receives RTSP streams into the streaming media server and forwards HLS-sliced video to clients. Then, a WebGL panoramic video player was written based on the Three.js library with JQuery browser playback controls, providing an HTML5 panoramic video player. Next, the latitude-longitude sphere model from the Three.js library was analysed with respect to the WebGL rendering method, pointing out the drawbacks of this model and the starting point for improvement. After that, on the basis of the Schneider transform principle, a Schneider sphere projection model was established, and the output OBJ file was converted to a JS file for the media player to read, finally achieving real-time, high-precision panoramic video playback without a plugin. Finally, the whole project is summarized and directions for future optimization and market extension are put forward.
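The latitude-longitude sphere model mentioned above maps an equirectangular video frame onto a sphere; a minimal sketch of that mapping (vertex positions plus texture coordinates), written independently of Three.js, shows why sample density bunches up near the poles, which is the drawback the Schneider-based model is meant to address.

```python
import numpy as np

def latlong_sphere(n_lat=32, n_lon=64, radius=1.0):
    """Vertices and equirectangular UVs of a latitude-longitude sphere mesh."""
    lat = np.linspace(-np.pi / 2, np.pi / 2, n_lat)   # latitude rows
    lon = np.linspace(-np.pi, np.pi, n_lon)           # longitude columns
    lon_g, lat_g = np.meshgrid(lon, lat)
    x = radius * np.cos(lat_g) * np.cos(lon_g)
    y = radius * np.sin(lat_g)
    z = radius * np.cos(lat_g) * np.sin(lon_g)
    u = (lon_g + np.pi) / (2 * np.pi)                  # texture column
    v = (lat_g + np.pi / 2) / np.pi                    # texture row
    return np.stack([x, y, z], axis=-1), np.stack([u, v], axis=-1)

verts, uvs = latlong_sphere()
# Rows near the poles span a much smaller circumference than equatorial rows,
# yet receive the same number of texture samples: the pole-distortion drawback.
row_circumference = 2 * np.pi * np.cos(np.linspace(-np.pi / 2, np.pi / 2, 32))
print("circumference near pole vs near equator:",
      row_circumference[1].round(3), "vs", row_circumference[15].round(3))
```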