901 results for travel time reliability


Relevance: 30.00%

Abstract:

Background. Obesity affects not only Texas adults but is also becoming a major issue among low-income Latino children. Among Latino children 2-5 years of age, 1997 Pediatric Nutrition Surveillance data found an obesity prevalence of 12 percent, the highest among all ethnic groups. Children learn what and how to eat from their environment. Although many mothers work outside the home, they remain the principal caregivers and the main influence on their toddler's diet. Self-efficacy, a concept introduced by Albert Bandura, is one's belief that one is capable of performing a behavior needed to reach an intended goal, and it is becoming increasingly important in nutrition and health education. This study is important for understanding the degree of impact a mother's self-efficacy has on her child's diet, and useful for knowing whether influencing a mother's self-efficacy could improve a child's diet and so help prevent public health issues such as obesity and diabetes. The purpose of this study was to examine the nutrition self-efficacy of Latina mothers, focusing on sweets and beverages, and to determine whether their self-efficacy affected their child's diet. Methods. The data were collected during July-September 2008. Mothers were recruited from two federally qualified San Antonio health centers. To qualify, participants had to be Hispanic with children of toddler age. Mothers were informed of incentives available upon completion. The interview consisted of demographic information, a set of five self-efficacy questions that was repeated at completion to test reliability, and a 24-hour food recall of the participant's child's diet. Results. A total of 225 mothers participated across the two clinics. The Cronbach alpha scores for the self-efficacy questions were .44 for the first administration and .49 for the second. The three most common beverages reported were milk, juice, and water. Mothers who met or exceeded the recommended amount of milk for their child (800 mg of calcium, or 3 cups/24 oz) had a higher self-efficacy score than those who did not meet the standard at all. Mothers who gave their children more juice than the recommended 4-6 oz for children 1-6 years of age had slightly higher self-efficacy scores than mothers who simply met the standard. In general, the lower a mother's self-efficacy, the more sweets she gave her child, and vice versa. Conclusion. This study's Kappa values were adequate, and this research showed that Latina mothers did in fact have high self-efficacy. In general, some of the children's diets did not reflect current scientific nutrition recommendations. To improve self-efficacy and have an impact on children's diets, the scientific community has a responsibility to make recommendations that are easily understood and can be put into practice. The public health community needs to encourage those we serve to be more active in their health and to educate them about what constitutes good health and nutrition for both themselves and their children.
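As a rough illustration of the internal-consistency statistic reported above, here is a minimal sketch of Cronbach's alpha; the five-item response matrix is hypothetical and not from the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 mothers x 5 self-efficacy items (1-5 Likert scale)
scores = np.array([
    [4, 5, 4, 3, 4],
    [2, 3, 2, 2, 3],
    [5, 5, 4, 5, 5],
    [3, 2, 3, 3, 2],
    [4, 4, 5, 4, 4],
    [1, 2, 2, 1, 2],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```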

Relevance: 30.00%

Abstract:

The Surgeon General recommends that preschoolers 3-5 years old accumulate 60 minutes of moderate-to-vigorous physical activity (MVPA) per day. However, there are limited data measuring physical activity (PA) and MVPA in this population. The purpose of this cross-sectional study is to determine the validity, reliability, and feasibility of using MVP 4 Function Walk4Life digital pedometers (MVP-4) to measure MVPA among preschoolers, using the newly modified direct observation technique, System for Observing Fitness Instruction Time-Preschool Version (SOFIT-P), as the gold standard. An ethnically diverse population of 3- to 5-year-old underserved children was recruited from two Harris County Department of Education (HCDE) Head Start centers. For 2 days at baseline and 2 days at post-test, the 75 enrolled children wore MVP-4 pedometers for approximately 6 hours per observation day and were observed using SOFIT-P during predominantly active times. Statistical analyses used Pearson r correlation coefficients to assess convergent and criterion validity and reliability, together with mean minutes of PA and MVPA. Significance was set at p < 0.05. Feasibility was determined through process evaluation information collected during the study via observations from data collectors and teacher input. Results show mean minutes of PA and MVPA ranged between 30-42 and 11-14 minutes, respectively. Convergent validity comparing BMI percentiles with MVP-4 PA outcomes showed no significance at pre-test; however, both measurements at post-test showed significance for MVPA (p = 0.0247 and p = 0.0056, respectively). Criterion validity comparing percent MVPA time between SOFIT-P and the MVP-4 pedometers was determined; however, the results were deemed insufficient due to inconsistency in observation times while using the newly developed SOFIT-P. Reliability measures showed no significance at pre-test, yet significant results for all PA outcomes at post-test (p = 0.001, p = 0.001, p = 0.0010, and p = 0.003, respectively). Finally, the MVP-4 pedometers lacked feasibility due to logistical barriers in their design. The researchers attribute the significant results at post-test to increased familiarity and more accurate placement of the pedometers over time. The researchers suggest that manufacturers of MVP-4 pedometers further modify the instrument for ease of use with this population, after which future studies should determine validity using objective measures or all-day direct observation techniques.
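For readers unfamiliar with the validity statistics used here, the following is a minimal sketch of a Pearson r criterion-validity check between pedometer minutes and observed minutes; the arrays are hypothetical, not study data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-child MVPA minutes: pedometer (MVP-4) vs. observation (SOFIT-P)
pedometer_mvpa = np.array([12.0, 14.5, 9.0, 11.0, 13.5, 10.5, 15.0, 8.5])
observed_mvpa  = np.array([11.0, 15.0, 10.0, 10.5, 12.5, 11.5, 14.0, 9.0])

r, p_value = pearsonr(pedometer_mvpa, observed_mvpa)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")  # significant if p < 0.05
```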

Relevance: 30.00%

Abstract:

Physical activity is a key component of the lifestyle modification process that helps reduce the risk of developing chronic diseases. It is important to have accurate estimates of physical activity in order to identify sedentary populations where interventions might be helpful. The International Physical Activity Questionnaire (IPAQ) short version has been used to estimate physical activity in diverse populations. However, there is little literature on the use of the IPAQ short version in the Mexican American population. This study addressed the predictive validity and test-retest reliability of the IPAQ short version in Mexican American adults. The analysis was performed on 97 participants enrolled in the Cameron County Hispanic Cohort. Individuals selected for this study were 18 years of age or older. Predictive validity was evaluated by studying the relationship between physical activity and biomarkers known to be correlated with physical activity, namely TNF-α, adiponectin, and HDL. Multiple linear regression analysis was performed to delineate predictive validity. To assess test-retest reliability, two IPAQ short "last seven days" questionnaires were administered to the participants by an interviewer on the same day, approximately two hours apart. Test-retest reliability was estimated by computing intraclass correlations between the readings at the two time points. The study showed that the IPAQ short version had acceptable test-retest reliability in this Mexican American population, but did not have acceptable predictive validity with respect to physical activity and TNF-α, adiponectin, and HDL in this sample.
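A minimal sketch of the intraclass correlation used for test-retest reliability, assuming the two-way random-effects ICC(2,1) form of Shrout and Fleiss; whether the study used exactly this variant is not stated in the abstract, and the data shown are hypothetical.

```python
import numpy as np

def icc_2_1(x: np.ndarray) -> float:
    """ICC(2,1), two-way random effects, for an (n_subjects, k_sessions) matrix."""
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # sessions
    resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical MET-minutes from two same-day IPAQ administrations
ipaq = np.array([[1200, 1150], [600, 720], [2400, 2300],
                 [300, 280], [900, 1000], [1500, 1450]], dtype=float)
print(f"ICC(2,1) = {icc_2_1(ipaq):.2f}")
```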

Relevance: 30.00%

Abstract:

During the summer of 2003, a ground-penetrating radar survey around the North Greenland Icecore Project (NorthGRIP) deep ice-core drilling site (75°06' N, 42°20' W; 2957 m a.s.l.) was carried out using a shielded 250 MHz radar system. The drill site is located on an ice divide, roughly 300 km north-northwest of the summit of the Greenland ice sheet. More than 430 km of profiles were measured, covering a 10 km by 10 km area, with a grid centered on the drilling location, and eight profiles extending beyond this grid. Seven internal horizons within the upper 120 m of the ice sheet were continuously tracked, containing the last 400 years of accumulation history. Based on the age-depth and density-depth distribution of the deep core, the internal layers have been dated and the regional and temporal distribution of accumulation rate in the vicinity of NorthGRIP has been derived. The distribution of accumulation shows a relatively smoothly increasing trend from east to west, from 145 kg/m²/a to 200 kg/m²/a over a distance of 50 km across the ice divide. The general trend is overlain by small-scale variations on the order of 2.5 kg/m²/a/km, i.e. around 1.5% of the accumulation mean. The temporal variations of the seven periods defined by the seven tracked isochrones are on the order of ±4% of the mean of the last 400 years, i.e. ±7 kg/m²/a at NorthGRIP. If the regional accumulation pattern has been stable for the last several thousand years during the Holocene, and ice flow has been comparable to today, advective effects along the particle trajectory upstream of NorthGRIP do not have a significant effect on the interpretation of climatically induced changes in accumulation rates derived from the deep ice core over the last 10 kyr.
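The derivation of accumulation rate from dated internal layers follows the standard relation below, written here in generic notation as a sketch rather than as the paper's own formula: the mean accumulation rate between two dated horizons is the water-equivalent mass between them divided by their age difference.

```latex
\dot{b} = \frac{1}{t_2 - t_1}\int_{z_1}^{z_2} \rho(z)\,\mathrm{d}z
```

where $z_1, z_2$ are the depths of the two horizons, $t_1, t_2$ their ages from the core chronology, and $\rho(z)$ the density-depth profile of the deep core.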

Relevance: 30.00%

Abstract:

Synthetic seismograms are constructed from check shot-corrected velocity and density measurements collected during Ocean Drilling Program (ODP) Leg 180 at Sites 1109, 1115, and 1118. The synthetic seismograms facilitate direct correlation of a coincident multichannel seismic (MCS) profile with borehole data collected at the three sites. The MCS data and the synthetic seismograms correlate very well, with most major reflectors successfully reproduced in the synthetics. Our results enable a direct calibration of the MCS data in terms of age, paleoenvironment, and subsidence history. Seismic reflectors are time correlative within stratigraphic resolution but are often observed to result from different lithologies across strike. Our results facilitate the extrapolation of the sedimentation history into an unsampled section of Site 1118 and enable a full correlation between the three sites using all the data collected during ODP Leg 180. This study forms the foundation for regionalizing the site data to the northern margin of the Woodlark Basin, where the transition from continental rifting to seafloor spreading is taking place.
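A minimal sketch of the synthetic-seismogram construction described above, assuming normal-incidence reflectivity from acoustic impedance and a Ricker wavelet; the logs shown are hypothetical, not Leg 180 data, and the actual source wavelet used in the study is not specified in the abstract.

```python
import numpy as np

def ricker(f: float, dt: float, length: float = 0.128) -> np.ndarray:
    """Ricker wavelet of peak frequency f (Hz), sampled at dt (s)."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def synthetic_seismogram(velocity, density, wavelet):
    """Reflection coefficients from acoustic impedance, convolved with a wavelet."""
    z = velocity * density                    # acoustic impedance
    rc = (z[1:] - z[:-1]) / (z[1:] + z[:-1])  # normal-incidence reflectivity
    return np.convolve(rc, wavelet, mode="same")

# Hypothetical depth-sampled logs (m/s and g/cm^3)
v = np.array([1600.0, 1650.0, 1700.0, 2100.0, 2050.0, 2400.0])
rho = np.array([1.9, 1.95, 2.0, 2.2, 2.15, 2.35])
trace = synthetic_seismogram(v, rho, ricker(f=30.0, dt=0.002))
```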

Relevance: 30.00%

Abstract:

Homogenized data series of total ozone measurements taken by the regularly and well-calibrated Dobson and Brewer spectrophotometers at Hradec Králové (Czech Republic), together with data from the ERA-40 and ERA-Interim re-analyses, were assimilated and combined to investigate differences between the particular data sets over Central Europe, in the NH mid-latitudes. The Dobson-to-Brewer transfer function and the algorithm for approximating the re-analysis data were developed, tested, and applied to create an instrumentally consistent and complete total ozone data series covering the 50-year observation period 1961-2010. The assimilation reduced the well-known seasonal differences between Dobson and Brewer data below the 1% calibration limit of the spectrophotometers. Incorporating the ERA-40 and ERA-Interim total ozone data on days with missing measurements significantly improved the completeness and reliability of the data series, mainly in the first two decades of the period concerned. Consistent behaviour of the original and assimilated data sets was found in the pre-ozone-hole period (1961-1985). In the post-Pinatubo era (1994-2010), the data series show seasonal differences that can introduce uncertainty into estimates of ozone recovery, mainly in the winter-spring season when the effect of the Montreal Protocol and its Amendments is expected. All the data sets also confirm substantial depletion of ozone in the summer months, which raises the question of its origin. The assimilated and completed total ozone data series will be further analyzed to quantify chemical ozone losses and the contribution of natural atmospheric processes to ozone depletion over the region. This case study points out the importance of selecting and evaluating the quality and consistency of the input data sets used to estimate long-term ozone changes, including recovery of the ozone layer over the selected areas.
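The abstract does not give the form of the Dobson-to-Brewer transfer function; a common approach, sketched below under that assumption, is to regress the instrument ratio on seasonal harmonics of the day of year and apply the fitted correction. All data shown are hypothetical.

```python
import numpy as np

def fit_seasonal_transfer(doy, ratio, n_harmonics=2):
    """Least-squares fit of a Dobson/Brewer ratio to day-of-year harmonics."""
    cols = [np.ones_like(doy, dtype=float)]
    for k in range(1, n_harmonics + 1):
        w = 2 * np.pi * k * doy / 365.25
        cols += [np.cos(w), np.sin(w)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, ratio, rcond=None)
    return coef, A

# Hypothetical paired observations (day of year, Dobson/Brewer total-ozone ratio)
doy = np.array([15, 74, 135, 196, 258, 319])
ratio = np.array([1.012, 1.004, 0.996, 0.993, 0.999, 1.009])
coef, A = fit_seasonal_transfer(doy, ratio)
correction = A @ coef   # multiplicative factor giving Dobson-equivalent values
```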

Relevance: 30.00%

Abstract:

Distributed real-time embedded systems are becoming increasingly important to society. More demands will be made on them and greater reliance will be placed on the delivery of their services. A relevant subset of them is high-integrity or hard real-time systems, where failure can cause loss of life, environmental harm, or significant financial loss. Additionally, the evolution of communication networks and paradigms, as well as the need for greater processing power and fault tolerance, has motivated the interconnection of electronic devices; many of these communication technologies can transfer data at high speed. The concept of distributed systems emerged to describe systems whose different parts are executed on several nodes that interact with each other via a communication network. Java's popularity, facilities and platform independence have made it an interesting language for the real-time and embedded community. This was the motivation for the development of RTSJ (Real-Time Specification for Java), a language extension intended to allow the development of real-time systems. The use of Java in the development of high-integrity systems requires strict development and testing techniques. However, RTSJ includes a number of language features that are forbidden in such systems. In the context of the HIJA project, the HRTJ (Hard Real-Time Java) profile was developed to define a robust subset of the language that is amenable to static analysis for high-integrity system certification. Currently, a specification under the Java Community Process (JSR-302) is being developed. Its purpose is to define the capabilities needed to create safety-critical applications with Java technology, called Safety Critical Java (SCJ). However, neither RTSJ nor its profiles provide facilities to develop distributed real-time applications. This is an important issue, as most current and future systems will be distributed. The Distributed RTSJ (DRTSJ) Expert Group was created under the Java Community Process (JSR-50) in order to define appropriate abstractions to overcome this problem; currently there is no formal specification. The aim of this thesis is to develop a communication middleware that is suitable for the development of distributed hard real-time systems in Java, based on the integration between the RMI (Remote Method Invocation) model and the HRTJ profile. It has been designed and implemented keeping in mind the main requirements, such as predictability and reliability in timing behavior and resource usage. The design starts with the definition of a computational model which identifies, among other things, the communication model, the most appropriate underlying network protocols, the analysis model, and a subset of Java for hard real-time systems. In the design, remote references are the basic means for building distributed applications; they are associated with all the non-functional parameters and resources needed to implement synchronous or asynchronous remote invocations with real-time attributes. The proposed middleware separates resource allocation from the execution itself by defining two phases and a specific threading mechanism that guarantees suitable timing behavior. It also includes mechanisms to monitor functional and timing behavior. It provides independence from the network protocol by defining a network interface and modules. The JRMP protocol was modified to include the two phases, non-functional parameters, and message-size optimizations.
Although serialization is one of the fundamental operations for ensuring proper data transmission, current implementations are not suitable for hard real-time systems and there are no alternatives. This thesis proposes a predictable serialization that introduces a new compiler to generate optimized code according to the computational model. The proposed solution has the advantage of allowing us to schedule the communications and to adjust the memory usage at compilation time. In order to validate the design and the implementation, a demanding validation process was carried out with emphasis on the functional behavior, the memory usage, the processor usage (the end-to-end response time and the response time in each functional block) and the network usage (actual consumption compared with the calculated consumption). The results obtained in an industrial application developed by Thales Avionics (a Flight Management System) and in exhaustive tests show that the design and the prototype are reliable for industrial applications with strict timing requirements.
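The middleware itself is Java-based (RMI plus the HRTJ profile); purely as a language-neutral illustration of the two-phase idea, binding resources before any invocation so that the invocation phase performs no dynamic allocation, here is a sketch in Python with illustrative names throughout. It is not the thesis API.

```python
from dataclasses import dataclass

@dataclass
class NonFunctionalParams:
    deadline_ms: int    # relative deadline of the remote invocation
    priority: int       # priority of the server-side handler thread
    max_msg_bytes: int  # worst-case message size, fixed up front

class RemoteRef:
    """Sketch of a two-phase remote reference (all names are illustrative)."""
    def __init__(self, endpoint, params: NonFunctionalParams):
        self.endpoint, self.params = endpoint, params
        self._buffer = None

    def bind(self):
        # Phase 1: allocate every resource ahead of time, so the
        # invocation phase performs no dynamic allocation.
        self._buffer = bytearray(self.params.max_msg_bytes)

    def invoke(self, method: str, payload: bytes) -> bytes:
        # Phase 2: execute using only pre-allocated resources.
        assert self._buffer is not None, "bind() must run before invoke()"
        n = len(payload)
        if n > self.params.max_msg_bytes:
            raise ValueError("message exceeds negotiated worst-case size")
        self._buffer[:n] = payload
        # ... marshal from self._buffer and send over self.endpoint ...
        return bytes(self._buffer[:n])  # placeholder echo
```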

Relevance: 30.00%

Abstract:

This paper presents some of the results of a method to determine the main reliability functions of concentrator solar cells. High-concentration GaAs single-junction solar cells have been tested in an accelerated life test. The method can be applied directly to multi-junction solar cells. The main conclusions of this test show that these solar cells are robust devices with a very low probability of failure caused by degradation during their operating life (more than 30 years). The probability-of-operation function (i.e. the reliability function R(t)) is evaluated for two nominal operating conditions of these cells, namely simulated concentration ratios of 700 and 1050 suns. A preliminary determination of the mean time to failure indicates a value much higher than the intended operating lifetime of the concentrator cells.
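The abstract does not state which life distribution was fitted; assuming a Weibull model, a common choice for accelerated-life-test data, the reliability function and mean time to failure take the form sketched below, with purely hypothetical parameters.

```python
import numpy as np
from scipy.special import gamma

def weibull_reliability(t, eta, beta):
    """R(t) = exp(-(t/eta)^beta) for a Weibull life distribution."""
    return np.exp(-(t / eta) ** beta)

def weibull_mttf(eta, beta):
    """Mean time to failure: MTTF = eta * Gamma(1 + 1/beta)."""
    return eta * gamma(1.0 + 1.0 / beta)

# Hypothetical Weibull parameters (not the paper's fitted values)
eta, beta = 2.0e6, 1.8  # characteristic life in hours, shape parameter
hours_30_years = 30 * 365 * 24
print(weibull_reliability(hours_30_years, eta, beta))  # survival probability
print(weibull_mttf(eta, beta) / (365 * 24), "years MTTF")
```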

Relevance: 30.00%

Abstract:

Concentrator photovoltaic systems (CPV) seem to be one of the most promising ways to generate electricity at competitive prices. Nowadays, the research is focused on increasing the efficiency and the concentration of the systems in order to reduce costs. At the same time, another important area of research is the study of the reliability of the different components which make up a CPV system. In fact, in order for a CPV to be cost-effective, it should have a warranty at least similar to that of the systems based on Si solar cells. In the present thesis, we will study in depth the behavior of multijunction solar cells under ultra-high concentration. With this purpose in mind, a three-dimensional circuital distributed model which is able to simulate the behavior of triple-junction solar cells under different working conditions has been developed. Also, an advanced characterization of these solar cells has been carried out in order to better understand their behavior and thus contribute to improving efficiency. Finally, accelerated life tests have been carried out on commercial lattice-matched triple-junction solar cells in order to determine their reliability. In order to simulate triple-junction solar cells, a 3D circuital distributed model which integrates a full description of the tunnel junction has been developed. We have analyzed the behavior of the multijunction solar cell under light profiles which cause the current density photo-generated in the solar cell to be higher than the tunnel junction's peak current density. The advanced model developed also takes into account the lateral current spreading through the semiconductor layers which constitute and surround the tunnel junction. Therefore, the effects of non-uniform light profiles, in both irradiance and the spectral content (chromatic aberration) produced by the concentrators on the solar cell, have been simulated and analyzed. In order to determine which recombination mechanisms are limiting the behavior of each subcell in a triple-junction stack, and to try to reduce them when possible, an electrical characterization of single-junction solar cells that resemble the subcells in a triple-junction stack has been carried out. Also, the dark I-V curves of the GaInP and GaAs subcells in a dual-junction solar cell have been determined by using an electro-optical reciprocity theorem. Finally, the impact of the different recombination mechanisms on the behavior of the triple-junction solar cell under concentration has been analyzed. In order to determine the reliability of these solar cells, a temperature accelerated life test has been carried out on commercial triple-junction solar cells. In the present thesis, the design and the evolution of the test, as well as the data obtained from the analysis of the preliminary results, are presented.
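A minimal numeric sketch of the condition the distributed model is used to explore, i.e. the photogenerated current density exceeding the tunnel junction's peak current density; every value below is an illustrative order of magnitude, not a parameter from the thesis.

```python
def exceeds_tunnel_peak(j_photo: float, j_peak: float) -> bool:
    """True if the photogenerated current density exceeds the tunnel junction's
    peak current density, pushing the junction past its low-resistance region
    and introducing a voltage dip in the cell's I-V curve."""
    return j_photo > j_peak

# Hypothetical values in A/cm^2
j_sc_one_sun = 0.014                       # subcell J_sc at 1 sun
concentration = 1200                       # suns (ultra-high concentration)
peak_factor = 3.0                          # local peak of a non-uniform profile
j_photo = j_sc_one_sun * concentration * peak_factor
j_peak = 25.0                              # tunnel junction peak current density
print(exceeds_tunnel_peak(j_photo, j_peak))  # True: 50.4 > 25.0 here
```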

Relevance: 30.00%

Abstract:

In recent years, cities around the world have invested substantial amounts of money in measures to reduce congestion and car trips. These investments are potential responses to the well-known urban sprawl phenomenon, also called the "development trap", which leads to further congestion and a higher proportion of our time spent in slow-moving cars. In the search for solutions, the complex relationship between the urban environment and travel behaviour has been studied in a number of cases. The main question under discussion is how to encourage multi-stop tours. The objective of this paper is therefore to verify whether unobserved factors influence tour complexity. For this purpose, we use a database from a survey conducted in 2006-2007 in Madrid, a suitable case study for analyzing urban sprawl due to new urban developments and substantial changes in mobility patterns in recent years. A total of 943 individuals were interviewed from 3 selected neighbourhoods (CBD, urban and suburban). We study the effect of unobserved factors on trip frequency. This paper presents the estimation of a hybrid model in which the latent variable is called propensity to travel and the discrete choice model is composed of 5 tour-type alternatives. The results show that the characteristics of the neighbourhoods in Madrid are important for explaining trip frequency. The influence of land-use variables on trip generation is clear, in particular the presence of retail premises. Through the estimation of elasticities and forecasting, we determine to what extent land-use policy measures modify travel demand. Comparing aggregate elasticities with percentage variations, it can be seen that percentage variations could lead to inconsistent results. The results show that hybrid models explain travel behavior better than traditional discrete choice models.
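The hybrid model couples a latent "propensity to travel" with a discrete choice model; the sketch below shows only the discrete-choice core, i.e. multinomial logit probabilities over five tour types and the standard direct point elasticity, with hypothetical coefficients rather than the paper's estimates.

```python
import numpy as np

def mnl_probabilities(utilities: np.ndarray) -> np.ndarray:
    """Multinomial logit choice probabilities from systematic utilities."""
    e = np.exp(utilities - utilities.max())  # numerically stabilized softmax
    return e / e.sum()

def mnl_direct_elasticity(beta_k: float, x_k: float, p_i: float) -> float:
    """Direct point elasticity of alternative i's probability with respect
    to its own attribute x_k in an MNL model: E = beta_k * x_k * (1 - P_i)."""
    return beta_k * x_k * (1.0 - p_i)

# Hypothetical utilities for 5 tour-type alternatives
v = np.array([0.4, -0.2, 0.1, 0.9, -0.5])
p = mnl_probabilities(v)
print(p)
print(mnl_direct_elasticity(beta_k=-0.03, x_k=25.0, p_i=p[3]))
```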

Relevance: 30.00%

Abstract:

ATM, SDH and satellite links were used in the last century as broadcasters' contribution networks. However, the attractive price of IP networks has been changing this infrastructure over the last decade. Nowadays, IP networks are widely used, but their characteristics do not offer the level of performance required to carry high-quality video under certain circumstances. Data transmission is always subject to errors on the line. In the case of streaming, correction is attempted at the destination, while in file transfer, retransmissions are conducted until a reliable copy of the file is obtained. In the latter case, reception time is penalized because of the low priority this type of traffic usually has on the networks. While in streaming the image quality is adapted to the line speed and line errors result in decreased quality at the destination, in a file copy the difference between coding speed and line speed, along with transmission errors, is reflected in an increased transmission time. The way news or audiovisual programs are transferred from a remote office to the production centre depends on the time window and the type of line available; in many cases, it must be done in real time (streaming), with the resulting image degradation. The main purpose of this work is workflow optimization and image-quality maximization. For that reason, a transmission model for multimedia files adapted to JPEG2000 is described, based on combining the advantages of file transfer with those of streaming while putting aside the disadvantages of each. The method is based on two patents and consists of the safe transfer of the headers and the data considered vital for reproduction. Separately, the rest of the data is sent by streaming, making it possible to carry out recovery operations and error concealment. Using this model, image quality is maximized for the available time window. In this paper, we first give a brief overview of broadcasters' requirements and the solutions offered by IP networks. We then focus on a different solution for video file transfer. Taking the example of a broadcast center with mobile units (unidirectional video link) and regional headends (bidirectional link), we present a video file transfer method that satisfies the broadcasters' requirements.
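A minimal sketch of the header/payload split, assuming the "vital" data is approximated by the JPEG 2000 main header (everything before the first SOT, start-of-tile-part, marker); the patents' exact selection of vital data is not specified in the abstract, and the file name is hypothetical.

```python
SOT = b"\xff\x90"  # JPEG 2000 start-of-tile-part marker

def split_vital_header(codestream: bytes):
    """Split a JPEG 2000 codestream into the main header (up to the first
    tile-part) and the remaining tile data. A simplification of the method
    described above, for illustration only."""
    i = codestream.find(SOT)
    if i < 0:
        raise ValueError("no tile-part found; not a valid codestream?")
    return codestream[:i], codestream[i:]

with open("clip.j2c", "rb") as f:  # hypothetical raw codestream file
    header, body = split_vital_header(f.read())
# header -> sent over a reliable channel (e.g. TCP, retransmitted until intact)
# body   -> streamed; losses handled by error concealment at the decoder
```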

Relevance: 30.00%

Abstract:

The main goal of this proposal is to bring together the owners of the most advanced CPV technology with respect to the state of the art, in order to research new applications for CPV systems from this leading position. In addition to opening up new markets, it will unveil possible sources of failure in new environments outside Europe, in order to assure component reliability. The proposed project will also try to improve the current technology of the industrial partners (ISOFOTON and CONCENTRIX) by accelerating the learning curve that CPV must follow in order to reach the competitive market, lowering its cost significantly below that of current flat-panel PV within 3-4 years. The use of CPV systems in remote areas, with harsher radiation, ambient and infrastructure conditions, will help to increase the rate of progress of this technology. In addition, the ISFOC's contribution, which brings together seven power plants from seven CPV technologies totalling up to 3 MWpeak, will allow the most complete database of component and system performance to be generated, along with the effects of radiation and meteorology on system operation. Finally, regarding new applications for CPV, the project will use a 25 kWp CPV system in a stand-alone station in Egypt (NWRC), for the first time, for water pumping and irrigation purposes. In a similar way, ISOFOTON will connect up to 25 kWp of CPV to the Moroccan ONE utility grid. From the point of view of the research content of this project, which is directly addressed by the scope of the call, the cooperative research between UPM, FhG-ISE and the two companies will be favoured by the fact that all are progressing in similar directions: developing two-stage-optics CPV systems. In addition to these technology improvements, the UPM is very interested in developing a new module concept, recently patented, which fulfils all the required characteristics of a good CPV module with fewer components and at reduced cost.

Relevance: 30.00%

Abstract:

Recently, a new recipe for developing and deploying real-time systems has become increasingly adopted in the JET tokamak. Powered by the advent of x86 multi-core technology and the reliability of JET's well-established Real-Time Data Network (RTDN) for handling all real-time I/O, an official vanilla Linux kernel has been demonstrated to provide real-time performance to user-space applications that are required to meet stringent timing constraints. In particular, a careful rearrangement of the Interrupt ReQuest (IRQ) affinities, together with the kernel's CPU isolation mechanism, makes it possible to obtain either soft or hard real-time behavior depending on the synchronization mechanism adopted. Finally, the Multithreaded Application Real-Time executor (MARTe) framework is used for building applications particularly optimised for exploiting multicore architectures. In the past year, four new systems based on this philosophy have been installed and are now part of JET's routine operation. The focus of the present work is on the configuration and interconnection of the ingredients that enable these new systems' real-time capability, and on the impact that JET's distributed real-time architecture has on system engineering requirements such as algorithm testing and plant commissioning. Details are given about the common real-time configuration and development path of these systems, followed by a brief description of each system together with results regarding their real-time performance. A cycle-time jitter analysis of a user-space MARTe-based application synchronising over a network is also presented. The goal is to compare its deterministic performance while running on a vanilla and on a Messaging Realtime Grid (MRG) Linux kernel.
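A sketch of the two Linux mechanisms named above, IRQ affinity and CPU pinning, using the standard /proc and scheduler interfaces; the IRQ number and core are hypothetical, the isolated core is assumed to have been reserved at boot (e.g. isolcpus=3), and writing smp_affinity requires root. This is not MARTe or JET code.

```python
import os

ISOLATED_CPU = 3  # hypothetical core reserved at boot with isolcpus=3
NIC_IRQ = 57      # hypothetical IRQ number of the real-time network card

# Pin the current (real-time) process to the isolated core.
os.sched_setaffinity(0, {ISOLATED_CPU})

# Route the NIC's interrupt to the same core, so packet arrival and
# processing stay together on one isolated CPU (requires root).
with open(f"/proc/irq/{NIC_IRQ}/smp_affinity", "w") as f:
    f.write(f"{1 << ISOLATED_CPU:x}")  # hex CPU bitmask: "8" selects CPU 3
```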

Relevance: 30.00%

Abstract:

Look-up tables are collected and analysed for 12 European National Travel Surveys (NTS) in a harmonized way, covering the age group 13-84 years. Travel behaviour measured as kilometres, time use and trips per traveller is compared. Trips per traveller are very similar across the countries, whereas kilometres differ most, from minus 28% for Spain to plus 19% and plus 14% for Sweden and Finland. It is shown that two main factors behind the differences are GDP per capita and density in the urban areas; the latter is the main reason for the low level in Spain. Mode share is rather similar, except for Spain, which has a very high level of walking trips; there is more cycling in the Netherlands, more public transport in Switzerland, and more air traffic in Sweden. Normally, kilometres per respondent/inhabitant is used for national planning purposes, and this figure is strongly affected by the share of mobile travellers, as the sketch below illustrates. The immobile share varies between 8% and 28%, with 6 NTS at a 15-17% level. These differences are analysed and discussed, and it is concluded that the immobile share should be a little less than 15-17%, because it is assessed that some short trips might have been forgotten in these 6 countries. The share has a downward tendency with higher density. The resulting immobile share is very dependent on data collection methodology, sampling method, quality of interviewer fieldwork, etc. The paper shows further possibilities for improving local surveys through comparison with other countries.
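The dependence of the headline figure on the immobile share is simple arithmetic; a sketch with hypothetical numbers:

```python
def km_per_respondent(km_per_mobile_traveller: float,
                      immobile_share: float) -> float:
    """Kilometres per respondent = kilometres per mobile traveller,
    diluted by the share of respondents reporting no trips."""
    return km_per_mobile_traveller * (1.0 - immobile_share)

km_mobile = 40.0  # hypothetical daily km for mobile travellers
for share in (0.08, 0.15, 0.17, 0.28):  # observed range of immobile shares
    print(f"immobile share {share:.0%}: "
          f"{km_per_respondent(km_mobile, share):.1f} km per respondent")
```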