956 results for time of application


Abstract:

This bachelor's thesis presents a prototype of a hybrid cross-platform mobile application for Android and iOS. Hybrid mobile applications are a combination of mobile web applications and native mobile applications: they are built partially with web technologies, yet they can also access the native layer and sensors of the device. To the user they appear as native applications, since they are downloaded from the application stores and installed on the device. The prototype consists of migrating the financial news module of a bank's current mobile applications, reimplementing it as a hybrid application using one of the frameworks available on the market for this purpose. Hybrid development can save time and money when targeting more than one mobile platform. The objective of the project is to evaluate the advantages and disadvantages of hybrid development in terms of cost reduction, development time and the final result of the application. The project consists of several phases. The first phase is a study of hybrid applications currently on the market, using LinkedIn, Facebook and the Financial Times as examples, with emphasis on the technologies used, mobile network usage and the problems encountered during development. This is followed by a comparison of the most popular cross-platform development frameworks for hybrid applications in terms of their approach, supported platforms, programming languages, access to native device capabilities and licensing. This first phase results in the selection of the framework best suited to the requirements of the project, PhoneGap, and continues with a more detailed analysis of that framework's architecture, features and components. The next phase starts with a study of the company's current applications in order to extract the necessary source code and adapt it to the architecture of the prototype. The prototype makes use of the mechanism PhoneGap offers for accessing the native layer of the device: plugins. A custom plugin that accesses the native layer is designed and developed for each platform. Once the prototype is completed for Android, it is migrated and adapted to iOS. Finally, the prototypes are evaluated in terms of ease and time of development, performance, functionality, and the look and feel of the user interface.

Abstract:

It is a known fact that noise analysis is a suitable method for sensor performance surveillance. In particular, monitoring the response time of a sensor is an efficient way to anticipate failures and to have the opportunity to prevent them. In this work the response times of several sensors of the Trillo NPP are estimated by means of noise analysis. The procedure consists of modeling each sensor with autoregressive methods and obtaining the parameter of interest by analyzing the response of the model when a ramp is simulated as the input signal. Core-exit thermocouples and in-core self-powered neutron detectors are the main sensors analyzed, but other plant sensors are studied as well. Since several measurement campaigns have been carried out, it has also been possible to analyze the evolution of the estimated parameters over more than one fuel cycle. Some sensitivity studies on the sampling frequency of the signals and its influence on the estimated response time are also included. Calculations and analysis have been carried out within the framework of a collaboration agreement between the Trillo NPP operator (CNAT) and the School of Mines of Madrid.
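
The procedure lends itself to a compact illustration. The sketch below is a minimal, illustrative version of the idea (not the plant's actual implementation): an autoregressive model is fitted to a sensor noise record by the Yule-Walker method, and the response time is taken as the asymptotic delay of the model output behind a simulated ramp input; the demo signal, sampling rate and model order are assumptions.

```python
# Minimal sketch: AR model of sensor noise -> ramp-response-time estimate.
import numpy as np
from scipy import signal

def ramp_response_time(noise, order=8, fs=1.0, n=5000):
    """Estimate the sensor response time (s) from noise sampled at fs (Hz)."""
    x = np.asarray(noise) - np.mean(noise)
    # Yule-Walker estimate of the AR coefficients a[1..order]
    r = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])
    # AR model transfer function H(z) = 1 / (1 - a1*z^-1 - ...), scaled to
    # unit DC gain so a ramp input is tracked with a constant lag.
    den = np.concatenate(([1.0], -a))
    num = np.array([np.sum(den)])
    t = np.arange(n) / fs
    _, y = signal.dlsim((num, den, 1.0 / fs), t)   # ramp input u(t) = t
    return t[-1] - float(np.squeeze(y)[-1])        # asymptotic ramp delay (s)

# Demo on synthetic noise from a first-order sensor with tau = 2 s:
fs, tau = 10.0, 2.0
alpha = np.exp(-1.0 / (fs * tau))
e = np.random.default_rng(1).standard_normal(20000)
sensor_noise = signal.lfilter([1 - alpha], [1, -alpha], e)
print(f"estimated response time ~ {ramp_response_time(sensor_noise, fs=fs):.2f} s")
```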

Abstract:

The combination of minimum-time control and multiphase converters is a favorable option for dc-dc converters in applications where output voltage variation is required, such as RF amplifiers and dynamic voltage scaling in microprocessors, owing to its fast dynamic response. This paper presents an improved minimum-time control approach for multiphase buck converters, based on the charge-balance technique and aimed at fast output voltage transitions. Compared with the traditional method, the proposed control takes into account the phase delay and the current ripple in each phase. By investigating the behavior of the multiphase converter during the voltage transition, it resolves the problem of current imbalance after the transient, which can lead to a long settling time of the output voltage. The restriction of this control is that the output voltages the converter can provide are tied to the number of phases, because only the duty cycles at which the multiphase converter exhibits total ripple cancellation are used in this approach. The model of the proposed control is introduced, and the design constraints of the buck converter's filter for this control are discussed. To prove the concept, a four-phase buck converter is implemented and experimental results that validate the proposed control method are presented. The application of this control to RF envelope tracking is also presented.
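
The duty-cycle restriction mentioned above is easy to state in code. The sketch below assumes the standard interleaving property that total inductor-current ripple cancels at duty cycles D = k/N in an N-phase buck; the 12 V input is an illustrative value, not a figure from the paper.

```python
# Minimal sketch: reachable output voltages under the ripple-cancellation
# constraint of an N-phase interleaved buck (D = k/N, 0 < k < N).
def ripple_cancellation_targets(v_in: float, n_phases: int) -> list[float]:
    """Output voltages with total inductor-current ripple cancellation."""
    return [v_in * k / n_phases for k in range(1, n_phases)]

# Example: a four-phase buck, as in the prototype, with an assumed 12 V input
print(ripple_cancellation_targets(12.0, 4))   # [3.0, 6.0, 9.0]
```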

Abstract:

Three-dimensional kinematic analysis provides a quantitative assessment of upper limb motion and is used as an outcome measure to evaluate movement disorders. The aim of the present study is to present a set of kinematic metrics for quantifying characteristics of movement performance and the functional status of the subject during the execution of the activity of daily living (ADL) of drinking from a glass, to apply these metrics to healthy people and to a population with cervical spinal cord injury (SCI), and to analyze the metrics' ability to discriminate between healthy and pathological movement. Nineteen people participated in the study: 7 subjects with metameric level C6 tetraplegia, 4 subjects with metameric level C7 tetraplegia and 8 healthy subjects. The movement was recorded with a photogrammetry system. The ADL of drinking was divided into a series of clearly identifiable phases to facilitate analysis. Metrics describing the time of the reaching phase, the range of motion of the joints analyzed, and characteristics of movement performance such as the efficiency, accuracy and smoothness of the distal segment and inter-joint coordination were obtained. The performance of the drinking task was more variable in people with SCI than in the control group with respect to the metrics measured. Reaching time was longer in the SCI groups. The proposed metrics were able to discriminate between healthy and pathological movement, and relative deficits in efficiency were larger in people with SCI than in controls. These metrics can provide useful information in a clinical setting about the quality of the movement performed by healthy people and people with SCI during functional activities.
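
For illustration, the sketch below computes two representative metrics of this kind, reaching time and path efficiency of the distal segment, from a 3-D marker trajectory; the exact metric definitions and the function names are hypothetical, not taken from the study.

```python
# Minimal sketch: reaching time and path efficiency from a marker trajectory.
import numpy as np

def reaching_metrics(xyz: np.ndarray, fs: float) -> tuple[float, float]:
    """xyz: (n, 3) hand-marker positions for the reaching phase; fs in Hz."""
    reach_time = (len(xyz) - 1) / fs                      # phase duration (s)
    steps = np.diff(xyz, axis=0)
    path_length = float(np.sum(np.linalg.norm(steps, axis=1)))
    straight = float(np.linalg.norm(xyz[-1] - xyz[0]))
    efficiency = straight / path_length                   # 1.0 = straight path
    return reach_time, efficiency

# Example: a slightly curved 1-second reach sampled at 100 Hz
t = np.linspace(0.0, 1.0, 101)
path = np.stack([0.3 * t, 0.05 * np.sin(np.pi * t), 0.2 * t], axis=1)
print(reaching_metrics(path, fs=100.0))   # (~1.0 s, efficiency < 1)
```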

Abstract:

Protein folding occurs on a time scale ranging from milliseconds to minutes for a majority of proteins. Computer simulation of protein folding, from a random configuration to the native structure, is nontrivial owing to the large disparity between the simulation and folding time scales. As an effort to overcome this limitation, simple models with idealized protein subdomains, e.g., the diffusion–collision model of Karplus and Weaver, have gained some popularity. We present here new results for the folding of a four-helix bundle within the framework of the diffusion–collision model. Even with such simplifying assumptions, a direct application of standard Brownian dynamics methods would consume 10,000 processor-years on current supercomputers. We circumvent this difficulty by invoking a special Brownian dynamics simulation. The method features the calculation of the mean passage time of an event from the flux overpopulation method and the sampling of events that lead to productive collisions even if their probability is extremely small (because of large free-energy barriers that separate them from the higher probability events). Using these developments, we demonstrate that a coarse-grained model of the four-helix bundle can be simulated in several days on current supercomputers. Furthermore, such simulations yield folding times that are in the range of time scales observed in experiments.
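
As a rough illustration of the quantity being computed (though not of the authors' flux-overpopulation and rare-event sampling machinery), the sketch below runs plain overdamped Brownian dynamics over a double-well free-energy barrier and averages the passage time over repeated trajectories; the potential, units and parameters are assumptions.

```python
# Minimal sketch: mean passage time from overdamped Brownian dynamics.
import numpy as np

rng = np.random.default_rng(0)

def first_passage_time(x0=-1.0, x_target=1.0, dt=1e-4, kT=1.0, gamma=1.0):
    """Time for an overdamped particle in U(x) = (x^2 - 1)^2 / 4 to diffuse
    from one well (x0) over the barrier at x = 0 to x_target (reduced units)."""
    x, t = x0, 0.0
    sigma = np.sqrt(2.0 * kT * dt / gamma)      # thermal kick per step
    while x < x_target:
        force = -x * (x * x - 1.0)              # -dU/dx
        x += force * dt / gamma + sigma * rng.standard_normal()
        t += dt
    return t

times = [first_passage_time() for _ in range(20)]
print(f"mean passage time ~ {np.mean(times):.2f} (reduced units)")
```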

Abstract:

The authors study the timing of leniency applications using a novel application of multi-spell discrete-time survival analysis for a sample of cartels prosecuted by the European Commission between 1996 and 2014. The start of a Commission investigation does not affect the rate at which conspirators apply for leniency in the market under investigation, but it increases the rate of application in separate markets in which a conspirator in the investigated market was also engaged in collusion. The revision of the Commission's leniency programme in 2002 increased the rate of pre-investigation applications. These results shed light on enforcement efforts against cartels and other forms of collusion.
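
A common way to set up such a model, shown in the hedged sketch below with entirely hypothetical data and variable names, is to expand each cartel into one row per period at risk and fit the discrete-time hazard of a leniency application with a logit link; the paper's actual specification and covariates may differ.

```python
# Minimal sketch: discrete-time survival analysis via logit on period data.
import pandas as pd
import statsmodels.formula.api as smf

# One row per cartel and period at risk; event = 1 when a conspirator in that
# cartel applies for leniency in that period (all values invented).
df = pd.DataFrame({
    "event":         [0, 0, 1, 0, 0, 1, 0, 0, 0, 1],
    "period":        [1, 2, 3, 4, 1, 2, 1, 2, 3, 4],
    "investigation": [0, 0, 0, 0, 0, 1, 1, 1, 1, 1],  # Commission probe open?
    "post_2002":     [0, 0, 0, 0, 1, 1, 1, 1, 1, 1],  # revised leniency programme
})

# Discrete-time hazard of application, logit link on the cartel-period data.
hazard = smf.logit("event ~ period + investigation + post_2002", data=df).fit(disp=0)
print(hazard.params)   # log-odds effects on the application hazard
```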

Abstract:

The Lapeyre-Triflo FURTIVA valve aims at combining the favorable hemodynamics of bioprosthetic heart valves with the durability of mechanical heart valves (MHVs). The pivoting region of MHVs is of special hemodynamic interest, as it may be a region of high shear stresses combined with areas of flow stagnation: here, platelets can be activated and may form a thrombus, which in the most severe case can compromise leaflet mobility. In this study we set up an experiment to replicate the pulsatile flow in the aortic root and to study the flow in the pivoting region under physiological hemodynamic conditions (cardiac output CO = 4.5 L/min and CO = 3.0 L/min, f = 60 BPM). It was found that the flow velocity in the pivoting region could reach values close to that of the bulk flow during systole. At the onset of diastole the three valve leaflets closed in a very synchronous manner, with an average closing time of 55 ms, which is much slower than what has been measured for traditional bileaflet MHVs. Hot spots for elevated viscous shear stress (VSS) were found at the flanges of the housing and the tips of the leaflet ears. Systolic VSS was maximal during mid-systole and reached levels of up to 40 Pa.

Abstract:

Piggery pond sludge (PPS) was applied, as collected (Wet PPS) and following stockpiling for 12 months (Stockpiled PPS), to a sandy Sodosol and a clay Vertosol at sites on the Darling Downs of Queensland. Laboratory measures of N availability were carried out on unamended and PPS-amended soils to investigate their value in estimating supplementary N needs of crops in Australia's northern grains region. Cumulative net N mineralised from the long-term (30 weeks) leached aerobic incubation was described by a first-order single exponential model. The mineralisation rate constant (0.057/week) was not significantly different between Control and PPS treatments or across soil types when the amounts of initial mineral N applied in the PPS treatments were excluded. Potentially mineralisable N (N0) was significantly increased by the application of Wet PPS, and increased with increasing rate of application. Application of Wet PPS significantly increased the total amount of inorganic N leached compared with the Control treatments. Mineral N applied in Wet PPS contributed as much to the total mineral N status of the soil as did the N mineralised over time from organic N. Rates of CO2 evolution during the 30 weeks of aerobic leached incubation indicated that the Stockpiled PPS was more stabilised (19-28% of applied organic C mineralised) than the Wet PPS (35-58% of applied organic C mineralised), owing to the higher lignin content of the former. Net nitrate-N produced following 12 weeks of aerobic non-leached incubation was highly correlated with net nitrate-N leached during 12 weeks of aerobic incubation (R² = 0.96), although it was
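
For reference, the first-order single exponential model mentioned above is N_t = N0 (1 − e^(−kt)); the sketch below fits it to illustrative (invented) cumulative mineralisation data, recovering a rate constant near the reported 0.057/week.

```python
# Minimal sketch: fit the first-order mineralisation model to invented data.
import numpy as np
from scipy.optimize import curve_fit

def n_mineralised(t, n0, k):
    """Cumulative net N mineralised after t weeks (first-order kinetics)."""
    return n0 * (1.0 - np.exp(-k * t))

weeks = np.array([0, 2, 4, 8, 12, 18, 24, 30], dtype=float)
n_cum = np.array([0, 9, 16, 29, 40, 51, 60, 66], dtype=float)  # mg N/kg, invented

(n0_fit, k_fit), _ = curve_fit(n_mineralised, weeks, n_cum, p0=(70.0, 0.05))
print(f"N0 = {n0_fit:.1f} mg/kg, k = {k_fit:.3f}/week")   # k close to 0.057
```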

Abstract:

A variety of iron compounds containing vinyl or thiol functional groups (used as photoactivators) were synthesised, and some of these were successfully bound to both polyethylene and polypropylene backbones during processing in the presence of peroxide and an interlinking agent. Concentrates (masterbatches) of the photoactivators in PP and PE were prepared, and the pro-oxidant effect of the diluted masterbatches in the absence and presence of an antioxidant was evaluated. An antioxidant photoactivator (FeDNC) was found to sensitise the photoactivity of pro-oxidants (Metone A / Metone M), whereas an antioxidant (ZnDNC) was found to stabilise the polymer (PP and PE) containing both of these combinations. It was observed that a low concentration of FeDNC sensitises the polymer containing a very small concentration of NiDNC, whereas a higher concentration of FeDNC stabilises the polymer (LDPE) containing the same amount of NiDNC, compared with FeDNC alone. The photostability of unstabilised PP containing FeAc could be varied by varying the concentration of ZnDEC: both the induction period and the UV lifetime of the polymer increased with increasing concentration of ZnDEC. It is suggested that a ligand-exchange reaction may take place between FeAc and ZnDNC. A polymer-bound UV stabiliser (HAEB) and a thermal stabiliser (DBBA) were used with a non-extractable photoactivator (FeAc) in PP; small concentrations of the stabilisers (HAEB and DBBA) in combination with the photoactivator (FeAc) sensitise the polymer. The antioxidant present in commercial polymer (LDPE and PP) was found to be of a hindered phenol type, which was found to antagonise ZnDNC when used in combination with the photoactivators.

Abstract:

The aim of this thesis is to examine the experience of time of four professional occupational groups working in public sector organisations and the factors affecting this experience. The literature on time and work is examined to delineate the key parameters of research in this area. A broad organisational behaviour approach to the experience of time and work is developed, in which individual, occupational, organisational and socio-political factors are inter-related. The experience of secondary school teachers, further education lecturers, general medical practitioners and hospital consultants is then examined. Multiple methods of data collection are used: open-ended interviews, a questionnaire survey and the analysis of key documents relating to the institutional settings in which the four groups work. The research aims to develop our knowledge of working time by considering the dimensions of the experience of time at work, the contexts in which this experience is generated and the constraints these contexts give rise to. By developing our understanding of time as a key feature of work experience we also extend our knowledge of organisational behaviour in general. In conclusion, a model of the factors relating the experience of time to the negotiation of time at work is presented.

Abstract:

The aim of this Interdisciplinary Higher Degrees project was the development of a high-speed method of photometrically testing vehicle headlamps, based on the use of image processing techniques, for Lucas Electrical Limited. Photometric testing involves measuring the illuminance produced by a lamp at certain points in its beam distribution. Headlamp performance is best represented by an iso-lux diagram, showing illuminance contours, produced from a two-dimensional array of data. Conventionally, the tens of thousands of measurements required are made using a single stationary photodetector and a two-dimensional mechanical scanning system which enables a lamp's horizontal and vertical orientation relative to the photodetector to be changed. Even using motorised scanning and computerised data logging, the data acquisition time for a typical iso-lux test is about twenty minutes. A detailed study was made of the concept of using a video camera and a digital image processing system to scan and measure a lamp's beam without the need for time-consuming mechanical movement. Although the concept was shown to be theoretically feasible, and a prototype system designed, it could not be implemented because of the technical limitations of commercially available equipment. An alternative high-speed approach was developed, however, and a second prototype system designed. The proposed arrangement again uses an image processing system, but in conjunction with a one-dimensional array of photodetectors and a one-dimensional mechanical scanning system in place of a video camera. This system can be implemented using commercially available equipment and, although not entirely eliminating the need for mechanical movement, greatly reduces the amount required, resulting in a predicted data acquisition time of about twenty seconds for a typical iso-lux test. As a consequence of the work undertaken, the company initiated an £80,000 programme to implement the system proposed by the author.
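
The speed advantage follows directly from the geometry of the scan. The back-of-envelope sketch below uses assumed grid sizes and per-measurement times (not figures from the project) chosen to reproduce the roughly twenty-minute versus twenty-second acquisition times quoted above.

```python
# Minimal sketch: acquisition time, 2-D point scan vs line array + 1-D scan.
H_POINTS, V_POINTS = 200, 100            # assumed beam grid (horizontal x vertical)

def full_2d_scan(seconds_per_point=0.06):
    """Single photodetector, two mechanical axes: every point visited in turn."""
    return H_POINTS * V_POINTS * seconds_per_point

def line_array_scan(seconds_per_row=0.2):
    """A 1-D photodetector array captures a whole row; only one axis moves."""
    return V_POINTS * seconds_per_row

print(f"2-D mechanical scan: {full_2d_scan() / 60:.0f} min")   # ~20 min
print(f"line array + 1-D scan: {line_array_scan():.0f} s")     # ~20 s
```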

Abstract:

Numerous techniques have been developed to control the cost and time of construction projects; however, there is limited research on issues surrounding the practical usage of these techniques. To address this, a survey of the top 150 construction companies and 100 construction consultancies in the UK was conducted, aimed at identifying common project control practices and the factors inhibiting effective project control in practice. It found that, despite the wide application of control techniques, a high proportion of respondents still experienced cost and time overruns on a significant proportion of their projects. Analysis of the survey results concluded that more effort should be directed at managing the top project-control-inhibiting factors identified. This paper outlines some measures for mitigating these inhibiting factors so that the outcome of project time and cost control can be improved in practice.

Abstract:

In this work we experimentally investigate the response time of humidity sensors based on polymer optical fiber Bragg gratings. By etching with acetone we can control the diameter of the poly(methyl methacrylate)-based fiber in order to reduce the diffusion time of water into the polymer and hence speed up the relative wavelength change caused by humidity variations. A much improved response time of 12 minutes for a humidity decrease and 7 minutes for a humidity increase has been achieved by using a polymer optical fiber Bragg grating with a reduced diameter of 135 microns.
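
The rationale for etching can be sketched numerically: if water uptake is Fickian, the response time should scale roughly with the square of the fiber diameter. In the sketch below the 135 µm diameter comes from the text, while the 240 µm reference diameter and 22-minute reference response time are assumptions for illustration.

```python
# Minimal sketch: response-time scaling under the Fickian-diffusion assumption.
def scaled_response_time(t_ref_min: float, d_ref_um: float, d_new_um: float) -> float:
    """Response time after etching, assuming t scales with diameter squared."""
    return t_ref_min * (d_new_um / d_ref_um) ** 2

# e.g. an assumed 22-minute response at 240 µm predicts ~7 min at 135 µm
print(f"{scaled_response_time(22.0, 240.0, 135.0):.1f} min")
```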

Abstract:

Models at runtime can be defined as abstract representations of a system, including its structure and behaviour, which exist in tandem with the given system during the actual execution time of that system. Furthermore, these models should be causally connected to the system being modelled, offering a reflective capability. Significant advances have been made in recent years in applying this concept, most notably in adaptive systems. In this paper we argue that a similar approach can also be used to support the dynamic generation of software artefacts at execution time. An important area where this is relevant is the generation of software mediators to tackle the crucial problem of interoperability in distributed systems. We refer to this approach as emergent middleware, representing a fundamentally new approach to resolving interoperability problems in the complex distributed systems of today. In this context, the runtime models are used to capture meta-information about the underlying networked systems that need to interoperate, including their interfaces and additional knowledge about their associated behaviour. This is supplemented by ontological information to enable semantic reasoning. This paper focuses on this novel use of models at runtime, examining in detail the nature of such runtime models coupled with consideration of the supportive algorithms and tools that extract this knowledge and use it to synthesise the appropriate emergent middleware.
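
To make the idea concrete, the sketch below (with hypothetical class and method names, not the authors' API) shows a causally connected runtime model that captures interface meta-information about two networked systems, plus a naive mediator-synthesis step; a real emergent-middleware system would use ontological reasoning rather than name matching.

```python
# Minimal sketch: runtime models of two systems and a toy mediator synthesis.
from dataclasses import dataclass, field

@dataclass
class RuntimeModel:
    """Reflective model of a networked system, kept in sync at runtime."""
    system: str
    interface: dict[str, str] = field(default_factory=dict)  # operation -> format

    def observe(self, operation: str, message_format: str) -> None:
        # Causal connection: monitoring the running system updates the model.
        self.interface[operation] = message_format

def synthesise_mediator(client: RuntimeModel, service: RuntimeModel) -> dict[str, str]:
    """Map matching operations between the two models (toy name equality)."""
    return {op: op for op in client.interface if op in service.interface}

a, b = RuntimeModel("clientApp"), RuntimeModel("legacyService")
a.observe("getWeather", "json"); b.observe("getWeather", "xml")
print(synthesise_mediator(a, b))   # {'getWeather': 'getWeather'}
```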