862 results for Integrated user model


Relevance:

30.00%

Publisher:

Abstract:

A model is developed to represent the activity of a farm using linear programming. The model has two main components: the soil fertility balance and livestock nutrition. In the first, the farm has a total nitrogen requirement that must be met either from internal sources (manure) or from external sources (fertilisers). The second component describes animal husbandry as having a nutritional requirement that must be satisfied through on-farm production of arable crops or the purchase of feed on the market. The farmer is assumed to maximise total net income from the agricultural and livestock activities by choosing one rotation among those feasible for the local climate and slope. The analysis takes a short-run perspective: the structure of the farm is fixed, with no possibility of changing the allocation of permanent crops or the scale of animal husbandry. The model is integrated with an environmental module that describes the role of the farm within the carbon-nitrogen cycle. On the one hand, the farm stores carbon through plant photosynthesis and carbon accumulation in the soil; on the other, some farm activities emit greenhouse gases into the atmosphere. The model is tested on representative farms of the Emilia-Romagna region, proving capable of producing distinct results for conventional and organic farming and providing first results on their differing atmospheric impacts. Relevant data about the representative farms and the feasible rotations are extracted from the FADN database, supplemented with coefficients from the literature.
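
The abstract gives no equations, but the structure it describes (income maximisation subject to a nitrogen balance and a feed balance) maps directly onto a standard LP formulation. Below is a minimal sketch in Python with SciPy; every activity, coefficient and bound is invented for illustration and is not taken from the thesis.

```python
from scipy.optimize import linprog

# Hypothetical decision variables (all invented for illustration):
# x0 = hectares of an arable rotation, x1 = tonnes of purchased feed,
# x2 = tonnes of purchased nitrogen fertiliser.
net_income = [900.0, -250.0, -600.0]     # EUR per unit; linprog minimises,
c = [-v for v in net_income]             # so negate to maximise income

# Nitrogen balance: crop uptake (120 kg N/ha) must not exceed manure
# from the fixed herd (4000 kg N) plus purchased N (1000 kg N/t).
A_ub = [[120.0, 0.0, -1000.0]]           # 120*x0 - 1000*x2 <= 4000
b_ub = [4000.0]

# Livestock nutrition: the herd needs 50 t of feed, from on-farm crops
# (6 t/ha) or from the market (1 t per tonne purchased).
A_eq = [[6.0, 1.0, 0.0]]                 # 6*x0 + 1*x1 == 50
b_eq = [50.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 20), (0, None), (0, None)])
print(res.x, -res.fun)                   # optimal plan, maximised net income
```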

Relevance:

30.00%

Publisher:

Abstract:

A two-dimensional model to analyze the distribution of magnetic fields in the airgap of PM electrical machines is studied. A numerical algorithm for non-linear magnetic analysis of multiphase surface-mounted PM machines with semi-closed slots is developed, based on the equivalent magnetic circuit method. By using a modular geometry whose basic element can be duplicated, the algorithm can represent any type of winding distribution. Compared with finite-element analysis (FEA), it reduces computing time and lets the user change parameter values directly in a user interface without redesigning the model. Output torque and the radial forces acting on the moving part of the machine can be calculated. In addition, an analytical model for calculating radial forces in multiphase bearingless Surface-Mounted Permanent Magnet Synchronous Motors (SPMSM) is presented. It predicts the amplitude and direction of the force as a function of torque current, levitation current and rotor position. It is based on the space-vector method, which also permits analysis of the machine during transients. The calculations expand the analytical functions in Fourier series, taking all possible interactions between stator and rotor mmf harmonic components into account; since the model is parametrized, the effects of the machine's electrical and geometrical quantities can be analyzed. The model is implemented in the design of a control system for bearingless machines, as an accurate electromagnetic model integrated in a three-dimensional mechanical model in which one end of the motor shaft is constrained, to simulate the presence of a mechanical bearing, while the other end is free, supported only by the radial forces developed by the interacting magnetic fields, realizing a bearingless system with three degrees of freedom. The complete model represents the design of the experimental system to be realized in the laboratory.
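
A rough illustration of the force mechanism the abstract relies on: in bearingless SPMSM theory, a net radial force arises when a p-pole-pair rotor field interacts with a p±1 stator harmonic, which can be evaluated by integrating the Maxwell stress around the airgap. The sketch below is hypothetical (invented geometry and flux-density amplitudes) and is far simpler than the thesis's equivalent-circuit and space-vector models.

```python
import numpy as np

MU0 = 4e-7 * np.pi
R, L = 0.05, 0.10                  # hypothetical airgap radius, stack length [m]
theta = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)

# Hypothetical airgap flux density: p-pole-pair magnet field plus a
# (p + 1) levitation-winding harmonic.
p = 2
B = 0.9 * np.cos(p * theta) + 0.1 * np.cos((p + 1) * theta)   # [T]

# Radial Maxwell stress (tangential field neglected), integrated to get
# the net force on the rotor; only the p/(p+1) cross term survives.
sigma = B**2 / (2.0 * MU0)                                    # [N/m^2]
Fx = R * L * np.trapz(sigma * np.cos(theta), theta)
Fy = R * L * np.trapz(sigma * np.sin(theta), theta)
print(f"net radial force: Fx = {Fx:.1f} N, Fy = {Fy:.1f} N")
```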

Relevance:

30.00%

Publisher:

Abstract:

A thesis whose goal is to analyse user behaviour in relation to online advertising and to build a model that approximates how it works, identifying the factors that influence its effectiveness.

Relevance:

30.00%

Publisher:

Abstract:

Self-organising pervasive ecosystems of devices are set to become a major vehicle for delivering infrastructure and end-user services. The inherent complexity of such systems poses new challenges to those who want to master it by applying the principles of engineering. The recent growth in the number and distribution of devices with substantial computational and communication capabilities, sharply accelerated by the massive diffusion of smartphones and tablets, is delivering a world with a much higher density of devices in space. Also, communication technologies increasingly focus on short-range device-to-device (P2P) interactions, with technologies such as Bluetooth and Near-Field Communication gaining wider adoption. Locality and situatedness become key to providing the best possible experience to users, and the classic model of a centralised, enormously powerful server gathering and processing data becomes less and less efficient as device density grows. Accomplishing complex global tasks without a centralised controller responsible for aggregating data, however, is challenging. In particular, there is a local-to-global issue that makes the application of engineering principles difficult at best: designing device-local programs that, through interaction, guarantee a certain global service level. In this thesis, we first analyse the state of the art in coordination systems, then motivate the work by describing the main issues of pre-existing tools and practices and identifying the improvements that would benefit the design of such complex software ecosystems. The contribution is divided into three main branches. First, we introduce a novel simulation toolchain for pervasive ecosystems, designed to allow good expressiveness while retaining high performance. Second, we leverage existing coordination models and patterns to create new spatial structures. Third, we introduce a novel language, based on the existing "Field Calculus" and integrated with the aforementioned toolchain, designed to be usable for practical aggregate programming.
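
To make the local-to-global issue concrete, the canonical aggregate-programming building block is the gradient: every device repeatedly applies the same purely local rule, and the network as a whole converges to a global distance-to-source field. A minimal self-contained simulation follows; the four-node network and its positions are invented, and this is plain Python rather than the thesis's Field Calculus-based language.

```python
import math

# Hypothetical network: node id -> position, plus a neighbourhood relation.
nodes = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (1.0, 1.0), 3: (2.0, 1.0)}
neighbours = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
sources = {0}

def dist(a, b):
    (ax, ay), (bx, by) = nodes[a], nodes[b]
    return math.hypot(ax - bx, ay - by)

# Device-local rule: a source outputs 0; every other node outputs the
# minimum over neighbours of (neighbour estimate + link length). Iterating
# the rule in synchronous rounds drives the network to the global fixed point.
g = {n: 0.0 if n in sources else math.inf for n in nodes}
for _ in range(len(nodes)):            # enough rounds to converge here
    g = {n: 0.0 if n in sources
         else min((g[m] + dist(n, m) for m in neighbours[n]), default=math.inf)
         for n in nodes}
print(g)                               # the distance-to-source field
```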

Relevance:

30.00%

Publisher:

Abstract:

Until a few years ago, 3D modelling was a topic confined to a professional environment. Nowadays, technological innovations, the 3D printer above all, have attracted novice users to this application field. This sudden breakthrough has not been supported by adequate software: the 3D editing tools currently available do not assist the non-expert user during the various stages of generating, interacting with and manipulating 3D virtual models. This is mainly due to the current paradigm, which is largely built around two-dimensional input/output devices and strongly affected by the resulting geometrical constraints. We identified three main phases that characterise the creation and management of 3D virtual models, and investigated them by evaluating and simplifying classic editing techniques in order to propose more natural and intuitive tools in a pure 3D modelling environment. In particular, we focused on freehand sketch-based modelling for creating 3D virtual models, on interaction and navigation in a 3D modelling environment, and on advanced editing tools for free-form deformation and object composition. In pursuing these goals we asked how new gesture-based interaction technologies can be successfully employed in 3D modelling environments, how depth perception and interaction in 3D environments could be improved, and which operations could simplify the classic virtual-model editing paradigm. Our main aim was to propose a set of solutions with which a common user can realise an idea as a 3D virtual model, drawing in the air just as they would on paper. Moreover, we used gestures and mid-air movements to explore and interact with 3D virtual environments, and we studied simple and effective 3D shape transformations. The work adopts the discrete representation of models, thanks to its intuitiveness, but especially because it is full of open challenges.
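
Of the editing tools mentioned, free-form deformation is the most algorithmically self-contained. As a point of reference, the sketch below implements the classic trivariate Bézier-lattice FFD of Sederberg and Parry, a textbook construction rather than the thesis's own tool; the control-point displacement standing in for a user gesture is invented.

```python
import numpy as np
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1.0 - t)**(n - i)

def ffd(points, lattice):
    """Deform points (local coordinates in [0,1]^3) by a control lattice
    of shape (l+1, m+1, n+1, 3), following Sederberg & Parry's FFD."""
    l, m, n = (s - 1 for s in lattice.shape[:3])
    out = np.zeros_like(points)
    for idx, (s, t, u) in enumerate(points):
        acc = np.zeros(3)
        for i in range(l + 1):
            for j in range(m + 1):
                for k in range(n + 1):
                    acc += (bernstein(l, i, s) * bernstein(m, j, t)
                            * bernstein(n, k, u)) * lattice[i, j, k]
        out[idx] = acc
    return out

# An undeformed lattice reproduces the identity; pulling one control point
# outward bends nearby geometry -- the basic "grab and drag" edit.
lat = np.stack(np.meshgrid(*[np.linspace(0, 1, 3)] * 3, indexing="ij"), axis=-1)
lat[2, 1, 1] += np.array([0.4, 0.0, 0.0])   # hypothetical mid-air gesture
pts = np.array([[0.5, 0.5, 0.5], [0.9, 0.5, 0.5]])
print(ffd(pts, lat))
```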

Relevance:

30.00%

Publisher:

Abstract:

Efficient coupling of light to quantum emitters, such as atoms, molecules or quantum dots, is one of the great challenges in current research. The interaction can be strongly enhanced by coupling the emitter to the evanescent field of subwavelength dielectric waveguides that offer strong lateral confinement of the guided light. In this context, subwavelength-diameter optical nanofibers as part of a tapered optical fiber (TOF) have proven to be a powerful tool which also provides an efficient transfer of the light from the interaction region to an optical bus, that is to say, from the nanofiber to an optical fiber.
Another approach towards enhancing light-matter interaction is to employ an optical resonator in which the light circulates and thus passes the emitters many times. Here, both approaches are combined by experimentally realizing a microresonator with an integrated nanofiber waist. This is achieved by building a fiber-integrated Fabry-Pérot type resonator from two fiber Bragg grating mirrors with a stop-band near the cesium D2-line wavelength. The characteristics of this resonator fulfill the requirements of nonlinear optics, optical sensing, and cavity quantum electrodynamics in the strong-coupling regime. Together with its advantageous features, such as a constant high coupling strength over a large volume, tunability, high transmission outside the mirror stop band, and a monolithic design, this resonator is a promising tool for experiments with nanofiber-coupled atomic ensembles in the strong-coupling regime.
The resonator's high sensitivity to the optical properties of the nanofiber provides a probe for changes of physical parameters that affect the guided optical mode, e.g., the temperature via the thermo-optic effect of silica. Utilizing this detection scheme, the thermalization dynamics due to far-field heat radiation of a nanofiber is studied over a large temperature range. This investigation provides, for the first time, a measurement of the total radiated power of an object with a diameter smaller than all absorption lengths in the thermal spectrum at the level of a single object of deterministic shape and material. The results show excellent agreement with an ab initio thermodynamic model that considers heat radiation as a volumetric effect and that takes the emitter shape and size relative to the emission wavelength into account. Modeling and investigating the thermalization of microscopic objects of arbitrary shape from first principles is of fundamental interest and has important applications, such as heat management in nano-devices or radiative forcing of aerosols in Earth's climate system.
Using a similar method, the effect of the TOF's mechanical modes on the polarization and phase of the fiber-guided light is studied. The measurement results show that in typical TOFs these quantities exhibit high-frequency thermal fluctuations. They originate from high-Q torsional oscillations that couple to the nanofiber-guided light via the strain-optic effect. An ab initio opto-mechanical model of the TOF is developed that provides an accurate quantitative prediction of the mode spectrum and the mechanically induced polarization and phase fluctuations. These high-frequency fluctuations may limit the ultimate ideality of fiber coupling into photonic structures. Furthermore, first estimates show that they may currently limit the storage time of nanofiber-based atom traps. The model, on the other hand, provides a method to design TOFs with tailored mechanical properties in order to meet experimental requirements.
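
For reference, the "strong-coupling regime" invoked above is the standard cavity-QED criterion. The textbook relations below (general definitions, not the thesis's parameter values) state it, with g the atom-field coupling rate, kappa the cavity field decay rate, gamma the atomic dipole decay rate, d the dipole matrix element, omega_c the cavity frequency and V_m the mode volume:

```latex
% Textbook strong-coupling criterion and cooperativity (assumed standard
% cavity-QED definitions, not values from the thesis):
g > \kappa,\ \gamma ,
\qquad
g = d\,\sqrt{\frac{\omega_c}{2\hbar\varepsilon_0 V_m}} ,
\qquad
C = \frac{g^{2}}{2\kappa\gamma} > 1 .
```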

Relevance:

30.00%

Publisher:

Abstract:

Analyzing and modeling relationships between the structure of chemical compounds, their physico-chemical properties, and biological or toxic effects in chemical datasets is a challenging task for scientific researchers in the field of cheminformatics. Since such (Q)SAR models are ultimately used to predict the properties of new compounds, model validation is essential to ensure predictivity on unseen data. Proper validation is also one of the requirements of regulatory authorities for approving the use of such models as an alternative testing method in real-world scenarios. At the same time, however, the question of how to validate a (Q)SAR model is still under discussion. In this work, we empirically compare k-fold cross-validation with external test set validation. The introduced workflow allows the built and validated models to be applied to large amounts of unseen data, and the performance of the different validation approaches to be compared. Our experimental results indicate that cross-validation produces (Q)SAR models with higher predictivity than external test set validation and reduces the variance of the results. Statistical validation is important for evaluating the performance of (Q)SAR models, but does not help the user to better understand the properties of the model or the underlying correlations. We present the 3D molecular viewer CheS-Mapper (Chemical Space Mapper), which arranges compounds in 3D space such that their spatial proximity reflects their similarity. The user can indirectly determine similarity by selecting which features to employ in the process. The tool can use and calculate different kinds of features, such as structural fragments as well as quantitative chemical descriptors. Comprehensive functionality, including clustering, alignment of compounds according to their 3D structure, and feature highlighting, helps the chemist to better understand patterns and regularities and to relate the observations to established scientific knowledge. Even though visualization tools for analyzing (Q)SAR information in small-molecule datasets exist, integrated visualization methods that allow for the investigation of model validation results are still lacking. We propose visual validation as an approach for the graphical inspection of (Q)SAR model validation results. New functionality in CheS-Mapper 2.0 facilitates the analysis of (Q)SAR information and allows the visual validation of (Q)SAR models. The tool enables the comparison of model predictions to the actual activity in feature space. Our approach reveals whether the endpoint is modeled too specifically or too generically and highlights common properties of misclassified compounds. Moreover, the researcher can use CheS-Mapper to inspect how the (Q)SAR model predicts activity cliffs. The CheS-Mapper software is freely available at http://ches-mapper.org.
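
The two validation strategies compared above are easy to state in code. Here is a minimal sketch with scikit-learn on synthetic stand-in data; the study's actual datasets, descriptors and learning algorithms are not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split

# Hypothetical stand-in for a (Q)SAR dataset: X = molecular descriptors,
# y = binary activity labels. Real work would use computed descriptors.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)

# Strategy 1: k-fold cross-validation -- every compound is predicted once.
cv_scores = cross_val_score(model, X, y, cv=10)

# Strategy 2: single external test set -- a held-out split never trained on.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0)
ext_score = model.fit(X_tr, y_tr).score(X_te, y_te)

print(f"10-fold CV: {cv_scores.mean():.3f} +/- {cv_scores.std():.3f}")
print(f"external test set: {ext_score:.3f}")
```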

Relevance:

30.00%

Publisher:

Abstract:

In recent years, European countries have paid increasing attention to renewable sources and greenhouse emissions, and the Council of the European Union and the European Parliament have established ambitious targets for the coming years. In this scenario, biomass plays a prominent role, since its life cycle produces zero net carbon dioxide emissions. Additionally, biomass can ensure continuity of plant operation thanks to its availability and storability. Several conventional systems running on biomass are currently available; most perform well either at large scale or in the small power range. The absence of an efficient system at small-to-medium scale inspired this thesis project. Its subject is an innovative plant based on a wet indirectly fired gas turbine (WIFGT) integrated with an organic Rankine cycle (ORC) unit for combined heat and power production. The WIFGT performs well in the small-to-medium power range, while the ORC unit can extract value from low-temperature heat sources. Their integration is investigated in this thesis with the aim of carrying out a preliminary design of the components. The targeted plant output is around 200 kW, so as not to require a wide cultivation area and to avoid long-distance biomass transport. Existing in-house simulation tools are adapted to this purpose. First, the WIFGT+ORC model is built from zero-dimensional models of the heat exchangers, compressor, turbines, furnace, dryer and pump. Several working fluids are screened; toluene and benzene prove the most suitable. In the indirectly fired gas turbine, a pressure ratio around 4 leads to the highest efficiency. The thermodynamic analysis shows an electric efficiency of 38%, outdoing other conventional plants in the same power range. The combined plant is designed to recover thermal energy: water used as coolant in the condenser is heated from 60°C to 90°C, making space heating possible. One-dimensional models are used to design the heat exchange equipment, with different types of heat exchangers chosen depending on the working temperature. A finned-plate heat exchanger is selected for the WIFGT heat-transfer equipment because of the high-temperature, oxidizing and corrosive environment. A once-through boiler with finned tubes is chosen to vaporize the organic fluid in the ORC, and a plate heat exchanger is chosen for the condenser and the recuperator. A quasi-one-dimensional model of a single-stage axial turbine is implemented to design both the WIFGT and the ORC turbines. Simulating the system after the component design shows an electric efficiency around 34%, a decrease of roughly 10% relative to the zero-dimensional analysis. The work demonstrates the system's potential compared with existing plants from both a technical and an economic point of view.
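
The 38% figure comes from the full zero-dimensional component models; as a back-of-the-envelope check of why a topping gas turbine with an ORC bottoming unit lands in that range, here is a hypothetical energy-balance sketch with invented component efficiencies:

```python
# Minimal zero-dimensional energy-balance sketch of a topping gas turbine
# with a bottoming ORC recovering its exhaust heat. All numbers below are
# hypothetical placeholders, not the thesis's design values.
Q_fuel = 1.0                 # normalised biomass heat input
eta_gt = 0.28                # assumed WIFGT electric efficiency
eta_orc = 0.16               # assumed ORC electric efficiency
f_rec = 0.85                 # fraction of GT reject heat fed to the ORC

W_gt = eta_gt * Q_fuel       # gas turbine electric output
Q_reject = Q_fuel - W_gt     # heat rejected by the topping cycle
W_orc = eta_orc * f_rec * Q_reject

eta_el = (W_gt + W_orc) / Q_fuel
print(f"combined electric efficiency: {eta_el:.1%}")
# 0.28 + 0.16 * 0.85 * 0.72 = 0.378 -> the ~38% order reported above
```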

Relevance:

30.00%

Publisher:

Abstract:

Analog filters and direct digital filters are implemented using digital signal processing techniques. Specifically, Butterworth, elliptic, and Chebyshev filters are implemented on the Motorola 56001 Digital Signal Processor by integrating three software tools: MATLAB, C++, and Motorola's Application Development System. The integrated environment allows a novice user to design a filter automatically by specifying the filter order and critical frequencies, while permitting more experienced designers to take advantage of MATLAB's advanced design capabilities. This project bridges the gap between the theoretical results produced by MATLAB and the practicalities of implementing digital filters on the Motorola 56001 Digital Signal Processor. While these results are specific to the Motorola 56001, they may be extended to other digital signal processors. MATLAB handles the filter calculations, a C++ routine handles the conversion to assembly code, and the Motorola software compiles and transmits the code to the processor.
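
The design step described (the user supplies only the filter order and critical frequencies and gets coefficients back) looks like the following in Python with SciPy, used here as a stand-in since the project's MATLAB scripts are not shown; all numeric values are illustrative.

```python
import numpy as np
from scipy import signal

# User-supplied design parameters (illustrative values).
order, fc, fs = 4, 1000.0, 8000.0        # order, cutoff [Hz], sample rate [Hz]
wn = fc / (fs / 2)                       # normalised cutoff (Nyquist = 1)

designs = {
    "Butterworth": signal.butter(order, wn),
    "Chebyshev I": signal.cheby1(order, 1.0, wn),    # 1 dB passband ripple
    "Elliptic": signal.ellip(order, 1.0, 40.0, wn),  # 40 dB stopband atten.
}

# Compare the three responses at a point in the stopband.
for name, (b, a) in designs.items():
    w, h = signal.freqz(b, a, worN=512, fs=fs)
    gain_db = 20 * np.log10(abs(h[np.argmin(abs(w - 2000.0))]))
    print(f"{name}: gain at 2 kHz = {gain_db:.1f} dB")
```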

Relevance:

30.00%

Publisher:

Abstract:

The Simulation Automation Framework for Experiments (SAFE) is a project created to raise the level of abstraction in network simulation tools and thereby address issues that undermine credibility. SAFE incorporates best practices in network simulation to automate the experimental process and to guide users in the development of sound scientific studies using the popular ns-3 network simulator. My contributions to the SAFE project are the design of two XML-based languages called NEDL (ns-3 Experiment Description Language) and NSTL (ns-3 Script Templating Language), which facilitate the description of experiments and of network simulation models, respectively. The languages provide a foundation for the construction of better interfaces between the user and the ns-3 simulator. They also provide input to a mechanism that automates the execution of network simulation experiments. Additionally, this thesis demonstrates that one can develop tools to generate ns-3 scripts in Python or C++ automatically from NSTL model descriptions.
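
NSTL's actual syntax is not given in the abstract, so the sketch below only illustrates the mechanism it implies: filling a script template from a model description to emit a runnable ns-3 C++ program. The template fields and parameter values are invented; the embedded ns-3 calls are from the standard point-to-point example.

```python
from string import Template

# Hypothetical model description (field names invented for illustration).
model = {"n_nodes": 4, "data_rate": "5Mbps", "delay": "2ms"}

# Skeleton of an ns-3 C++ script with template placeholders.
ns3_template = Template(r"""
#include "ns3/core-module.h"
#include "ns3/network-module.h"
#include "ns3/point-to-point-module.h"

using namespace ns3;

int main (int argc, char *argv[])
{
  NodeContainer nodes;
  nodes.Create ($n_nodes);

  PointToPointHelper p2p;
  p2p.SetDeviceAttribute ("DataRate", StringValue ("$data_rate"));
  p2p.SetChannelAttribute ("Delay", StringValue ("$delay"));

  Simulator::Run ();
  Simulator::Destroy ();
  return 0;
}
""")

# Emit the concrete experiment script from the model description.
with open("experiment.cc", "w") as f:
    f.write(ns3_template.substitute(model))
```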

Relevance:

30.00%

Publisher:

Abstract:

The aim of the current pilot study was to compare two strategies for applying the cognitive differentiation program of Integrated Psychological Therapy to people with schizophrenia. Twenty-six outpatients were randomly assigned to the program delivered in group sessions (CDg) or in individualized sessions (CDi). The program provides cognitive exercises to promote better cognitive performance, and both groups of participants completed the same number of exercises over the same number of sessions per week. Outcomes were assessed with neuropsychological measures of attention, executive functioning and everyday memory, and with a measure of everyday functioning. Effect sizes showed no effects on everyday memory and social functioning, greater improvement in attention in the CDi group, and greater improvement in executive functioning in the CDg condition. The results suggest that the mode of program delivery could be individualized according to patient-specific cognitive deficits.

Relevance:

30.00%

Publisher:

Abstract:

A search for a heavy standard model Higgs boson decaying via H→ZZ→ℓ⁺ℓ⁻νν, where ℓ = e, μ, is presented. It is based on proton-proton collision data at √s = 7 TeV, collected by the ATLAS experiment at the LHC in the first half of 2011 and corresponding to an integrated luminosity of 1.04 fb⁻¹. The data are compared to the expected standard model backgrounds. The data and the background expectations are found to be in agreement, and upper limits are placed on the Higgs boson production cross section over the entire mass window considered; in particular, the production of a standard model Higgs boson is excluded in the region 340 GeV < m_H < 450 GeV at the 95% confidence level.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To compare four different implantation modalities for the repair of superficial osteochondral defects in a caprine model using autologous, scaffold-free, engineered cartilage constructs, and to describe the short-term outcome of successfully implanted constructs. METHODS: Scaffold-free, autologous cartilage constructs were implanted within superficial osteochondral defects created in the stifle joints of nine adult goats. The implants were distributed between four 6-mm-diameter superficial osteochondral defects created in the trochlea femoris and secured in the defect using a covering periosteal flap (PF) alone or in combination with adhesives (platelet-rich plasma (PRP) or fibrin), or using PRP alone. Eight weeks after implantation surgery, the animals were killed. The defect sites were excised and subjected to macroscopic and histopathologic analyses. RESULTS: At 8 weeks, implants that had been held in place exclusively with a PF were well integrated both laterally and basally. The repair tissue manifested an architecture similar to that of hyaline articular cartilage. However, most of the implants that had been glued in place in the absence of a PF were lost during the initial 4-week phase of restricted joint movement. The use of human fibrin glue (FG) led to massive cell infiltration of the subchondral bone. CONCLUSIONS: The implantation of autologous, scaffold-free, engineered cartilage constructs might best be performed beneath a PF without the use of tissue adhesives. Successfully implanted constructs showed hyaline-like characteristics in adult goats within 2 months. Long-term animal studies and pilot clinical trials are now needed to evaluate the efficacy of this treatment strategy.

Relevance:

30.00%

Publisher:

Abstract:

A patient-specific surface model of the proximal femur plays an important role in planning and supporting various computer-assisted surgical procedures, including total hip replacement, hip resurfacing, and osteotomy of the proximal femur. The common approach to deriving 3D models of the proximal femur is to use imaging techniques such as computed tomography (CT) or magnetic resonance imaging (MRI). However, the high logistic effort, the extra radiation dose (CT imaging), and the large quantity of data to be acquired and processed make them less practical. In this paper, we present an integrated approach using a multi-level point distribution model (ML-PDM) to reconstruct a patient-specific model of the proximal femur from the sparse data available intra-operatively. Results of experiments performed on dry cadaveric bones using dozens of 3D points are presented, as well as experiments using a limited number of 2D X-ray images, both of which demonstrate promising accuracy of the present approach.
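
The abstract does not define the ML-PDM itself; a single-level point distribution model, its standard building block, amounts to PCA over corresponding landmarks followed by a least-squares fit of the mode weights to sparse observations. Everything below (training shapes, noise level, number of digitised points) is synthetic.

```python
import numpy as np

# Minimal point-distribution-model sketch (single level, not the thesis's
# multi-level ML-PDM): learn shape statistics from training shapes, then
# fit the model to a handful of sparse intra-operative points.
rng = np.random.default_rng(1)
n_shapes, n_pts = 30, 50
base = rng.normal(size=(n_pts, 3))
shapes = np.stack([base + 0.05 * rng.normal(size=(n_pts, 3))
                   for _ in range(n_shapes)])          # toy training set

X = shapes.reshape(n_shapes, -1)
mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
modes = Vt[:5]                        # first 5 modes of variation

# Fit mode weights b so the model matches sparse digitised points
# (least squares on the observed coordinates only).
obs_idx = np.arange(0, 3 * 12)        # pretend 12 points were digitised
target = X[0]                         # pretend patient anatomy
A = modes[:, obs_idx].T
b, *_ = np.linalg.lstsq(A, (target - mean)[obs_idx], rcond=None)
recon = (mean + b @ modes).reshape(n_pts, 3)
print("rms error:", np.sqrt(np.mean((recon - target.reshape(n_pts, 3))**2)))
```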

Relevance:

30.00%

Publisher:

Abstract:

Comments on an article by Kashima et al. (see record 2007-10111-001). In their target article, Kashima and colleagues try to show how a connectionist conceptualization of the self is best suited to capture the self's temporal and socio-culturally contextualized nature. They propose a new model and, to support it, conduct computer simulations of psychological phenomena whose importance for the self has long been clear, even if not formally modeled, such as imitation and the learning of sequence and narrative. As explicated when we advocated connectionist models as a metaphor for the self in Mischel and Morf (2003), we fully endorse the utility of such a metaphor, as these models have some of the processing characteristics necessary for capturing key aspects and functions of a dynamic cognitive-affective self-system. As elaborated in that chapter, we see their principal strength in the fact that connectionist models can take account of multiple simultaneous processes without invoking a single central control. All outputs reflect a distributed pattern of activation across a large number of simple processing units, the nature of which depends on (and changes with) the connection weights between the links and the satisfaction of mutual constraints across these links (Rumelhart & McClelland, 1986). This allows a simple account of why certain input features will at times predominate, while others take over on other occasions. (PsycINFO Database Record (c) 2008 APA, all rights reserved)
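
The mechanism described in the last two sentences (a distributed activation pattern settling under mutual constraints, with no central controller) is what a textbook attractor network implements. A tiny Hopfield-style sketch follows, purely illustrative and unrelated to Kashima et al.'s actual model:

```python
import numpy as np

# Weights store two patterns (Hebbian outer products); updating units
# asynchronously satisfies the constraints the weights encode, so the
# network settles into a stored pattern without central control.
rng = np.random.default_rng(0)
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1]])
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0.0)

state = np.array([1, -1, 1, 1, -1, -1])        # noisy input probe
for _ in range(20):                            # asynchronous updates
    i = rng.integers(len(state))
    state[i] = 1 if W[i] @ state >= 0 else -1
print(state)                                   # settles toward a stored pattern
```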