935 results for Mobile technologies
Abstract:
Background It has been hypothesized that children and adolescents might be more vulnerable to possible health effects from mobile phone exposure than adults. We investigated whether mobile phone use is associated with brain tumor risk among children and adolescents. Methods CEFALO is a multicenter case-control study conducted in Denmark, Sweden, Norway, and Switzerland that includes all children and adolescents aged 7-19 years who were diagnosed with a brain tumor between 2004 and 2008. We conducted interviews, in person, with 352 case patients (participation rate: 83%) and 646 control subjects (participation rate: 71%) and their parents. Control subjects were randomly selected from population registries and matched by age, sex, and geographical region. We asked about mobile phone use and included mobile phone operator records when available. Odds ratios (ORs) for brain tumor risk and 95% confidence intervals (CIs) were calculated using conditional logistic regression models. Results Regular users of mobile phones were not statistically significantly more likely to have been diagnosed with brain tumors compared with nonusers (OR = 1.36; 95% CI = 0.92 to 2.02). Children who started to use mobile phones at least 5 years ago were not at increased risk compared with those who had never regularly used mobile phones (OR = 1.26, 95% CI = 0.70 to 2.28). In a subset of study participants for whom operator recorded data were available, brain tumor risk was related to the time elapsed since the mobile phone subscription was started but not to amount of use. No increased risk of brain tumors was observed for brain areas receiving the highest amount of exposure. Conclusion The absence of an exposure-response relationship either in terms of the amount of mobile phone use or by localization of the brain tumor argues against a causal association.
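To make the reported risk measure concrete, the following minimal sketch computes a crude (unmatched) Wald odds ratio and 95% confidence interval from a 2x2 exposure table. The counts are hypothetical placeholders, not the CEFALO data, and the study itself used conditional logistic regression on matched case-control sets, so this illustrates the measure rather than the authors' analysis.

```python
# Illustrative only: Wald odds ratio and 95% CI from a 2x2 exposure table.
# The counts below are hypothetical placeholders, not the CEFALO data.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a: exposed cases, b: unexposed cases, c: exposed controls, d: unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_ci(a=194, b=158, c=311, d=335)
print(f"OR = {or_:.2f}, 95% CI = {lo:.2f} to {hi:.2f}")
```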
Abstract:
BACKGROUND: Assisted reproductive technology (ART) involves the manipulation of early embryos at a time when they may be particularly vulnerable to external disturbances. Environmental influences during the embryonic and fetal development influence the individual's susceptibility to cardiovascular disease, raising concerns about the potential consequences of ART on the long-term health of the offspring. METHODS AND RESULTS: We assessed systemic (flow-mediated dilation of the brachial artery, pulse-wave velocity, and carotid intima-media thickness) and pulmonary (pulmonary artery pressure at high altitude by Doppler echocardiography) vascular function in 65 healthy children born after ART and 57 control children. Flow-mediated dilation of the brachial artery was 25% smaller in ART than in control children (6.7±1.6% versus 8.6±1.7%; P<0.0001), whereas endothelium-independent vasodilation was similar in the 2 groups. Carotid-femoral pulse-wave velocity was significantly (P<0.001) faster and carotid intima-media thickness was significantly (P<0.0001) greater in children conceived by ART than in control children. The systolic pulmonary artery pressure at high altitude (3450 m) was 30% higher (P<0.001) in ART than in control children. Vascular function was normal in children conceived naturally during hormonal stimulation of ovulation and in siblings of ART children who were conceived naturally. CONCLUSIONS: Healthy children conceived by ART display generalized vascular dysfunction. This problem does not appear to be related to parental factors but to the ART procedure itself. CLINICAL TRIAL REGISTRATION: URL: www.clinicaltrials.gov. Unique identifier: NCT00837642.
Abstract:
A mobile ad hoc network (MANET) is a decentralized and infrastructure-less network. This thesis aims to provide system-level support for developers of applications or protocols in such networks. To do this, we propose contributions in both the algorithmic realm and the practical realm. In the algorithmic realm, we contribute to the field by proposing different context-aware broadcast and multicast algorithms for MANETs, namely six-shot broadcast, six-shot multicast, PLAN-B, and a generic algorithmic approach to optimize the power consumption of existing algorithms. We compare each proposed algorithm to existing algorithms that are either probabilistic or context-aware, and then evaluate their performance based on simulations. We demonstrate that in some cases, context-aware information, such as location or signal strength, can improve efficiency. In the practical realm, we propose a testbed framework, namely ManetLab, to implement and deploy MANET-specific protocols and to evaluate their performance. This testbed framework aims to increase the accuracy of performance evaluation compared with simulations, while keeping the ease of use offered by simulators for reproducing a performance evaluation. By evaluating the performance of different probabilistic algorithms with ManetLab, we observe that simulations and testbeds should be used in a complementary way. In addition to these original contributions, we also provide two surveys of system-level support for ad hoc communications in order to establish the state of the art. The first covers existing broadcast algorithms and the second covers existing middleware solutions and the way they deal with privacy, especially location privacy.
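As a rough illustration of the kind of probabilistic, context-aware forwarding decision such broadcast algorithms make, the sketch below shows a generic gossip-style rebroadcast rule that damps forwarding in dense neighbourhoods and boosts it at the coverage edge. The parameters and the signal-strength heuristic are illustrative assumptions, not the six-shot or PLAN-B algorithms themselves.

```python
# Minimal sketch of a probabilistic (gossip-style) broadcast decision in a MANET.
# Thresholds and heuristics are illustrative assumptions, not the thesis algorithms.
import random

def should_rebroadcast(base_probability: float,
                       duplicates_heard: int,
                       signal_strength_dbm: float) -> bool:
    """Decide whether a node forwards a received broadcast message.

    base_probability: forwarding probability of a plain gossip protocol.
    duplicates_heard: copies of the same message already overheard; many
        duplicates suggest the neighbourhood is already covered.
    signal_strength_dbm: received signal strength; a weak signal suggests the
        sender is far away, so rebroadcasting adds more new coverage.
    """
    p = base_probability
    p *= 0.5 ** duplicates_heard        # damp forwarding in dense neighbourhoods
    if signal_strength_dbm < -80:       # far from the sender (illustrative threshold)
        p = min(1.0, p * 1.5)           # boost forwarding at the coverage edge
    return random.random() < p

# Example: a node that overheard one duplicate of a message over a weak link
print(should_rebroadcast(base_probability=0.7, duplicates_heard=1, signal_strength_dbm=-85))
```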
Abstract:
A simple and sensitive liquid chromatography-electrospray ionization mass spectrometry method was developed for the simultaneous quantification in human plasma of all selective serotonin reuptake inhibitors (citalopram, fluoxetine, fluvoxamine, paroxetine and sertraline) and their main active metabolites (desmethyl-citalopram and norfluoxetine). A stable isotope-labeled internal standard was used for each analyte to compensate for the global method variability, including extraction and ionization variations. After sample (250 μl) pre-treatment with acetonitrile (500 μl) to precipitate proteins, a fast solid-phase extraction procedure was performed using a mixed-mode Oasis MCX 96-well plate. Chromatographic separation was achieved in less than 9.0 min on an XBridge C18 column (2.1×100 mm; 3.5 μm) using a gradient of ammonium acetate (pH 8.1; 50 mM) and acetonitrile as mobile phase at a flow rate of 0.3 ml/min. The method was fully validated according to Société Française des Sciences et Techniques Pharmaceutiques protocols and the latest Food and Drug Administration guidelines. Six-point calibration curves were used to cover a large concentration range: 1-500 ng/ml for citalopram, desmethyl-citalopram, paroxetine and sertraline; 1-1000 ng/ml for fluoxetine and fluvoxamine; and 2-1000 ng/ml for norfluoxetine. Good quantitative performance was achieved in terms of trueness (84.2-109.6%), repeatability (0.9-14.6%) and intermediate precision (1.8-18.0%) over the entire assay range, including the lower limit of quantification. Internal standard-normalized matrix effects were lower than 13%. The accuracy profiles (total error) were mainly included within the acceptance limits of ±30% for biological samples. The method was successfully applied to the routine therapeutic drug monitoring of more than 1600 patient plasma samples over 9 months. The β-expectation tolerance intervals determined during the validation phase were consistent with the results of quality control samples analyzed during routine use. This method is therefore precise and suitable both for therapeutic drug monitoring and for pharmacokinetic studies in most clinical laboratories.
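The quantification step behind such a multi-point calibration curve can be sketched as follows: fit the analyte/internal-standard peak-area ratio against nominal concentration, then back-calculate unknowns. The calibration values and the unweighted linear fit below are illustrative assumptions (validated bioanalytical methods often use weighted regression), not data from the published validation.

```python
# Illustrative six-point calibration and back-calculation; values are made up.
import numpy as np

# Hypothetical calibration levels (ng/ml) spanning 1-500 ng/ml, e.g. for citalopram
conc = np.array([1, 10, 50, 100, 250, 500], dtype=float)
area_ratio = np.array([0.021, 0.20, 1.02, 1.98, 5.05, 9.90])  # analyte/IS peak-area ratios

slope, intercept = np.polyfit(conc, area_ratio, 1)  # unweighted linear fit for simplicity

def back_calculate(ratio: float) -> float:
    """Return the concentration (ng/ml) corresponding to an observed area ratio."""
    return (ratio - intercept) / slope

print(f"Patient sample at area ratio 2.6 -> {back_calculate(2.6):.1f} ng/ml")
```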
Abstract:
In the context of demographic evolution, psychiatric care needs are increasing steadily in most western countries. Given financial limitations, it is mandatory to establish appropriate care priorities in order to avoid psychiatric hospitalizations by assisting care providers, general practitioners and nurses, at home or in nursing homes. A crisis team was established 18 months ago within the Division of Old Age Psychiatry in Lausanne. The care program includes immediate assistance in the community, assessment, crisis counseling, medication consultation, and referral to psychiatric services, providing an alternative to hospitalization. The first results indicate that this intervention is well accepted by users and corresponds to a real need.
Abstract:
This report documents Phase IV of the Highway Maintenance Concept Vehicle (HMCV) project, a pooled fund study sponsored by the Departments of Transportation of Iowa, Pennsylvania, and Wisconsin. The report provides the background, including a brief history of the earlier phases of the project, a systems overview, and descriptions of the research conducted in Phase IV. Finally, it provides conclusions and recommendations for future research. Background: The goal of the Highway Maintenance Concept Vehicle Pooled Fund Study is to provide travelers with the level of service defined by policy during the winter season at the least cost to taxpayers. This goal is to be accomplished by using information regarding actual road conditions to facilitate and adjust snow and ice control activities. The approach used in this study was to bring technology applications from other industries to the highway maintenance vehicle. This approach is evolutionary: as emerging technologies and applications are found to be acceptable to the pooled fund states and appear to have potential for supporting the study goals, they become candidates for our research. The objective of Phase IV is to conduct a limited deployment of selected technologies from Phase III by equipping a vehicle with proven advanced technologies and creating a mobile test laboratory for collecting road weather data. The research quickly pointed out that investments in winter storm maintenance assets must be based on benefit/cost analysis and related to improving the level of service. For example, Iowa has estimated the average cost of fighting a winter storm at typically $60,000 to $70,000 per hour. The maintenance concept vehicle will have advanced technology equipment capable of applying precisely the correct amount of material, accurately tailored to existing and predicted pavement conditions. Hence, a state using advanced technology could expect a noticeable impact on the average time taken to establish the winter driving service level. If the concept vehicle and the data it produces are used to support decision making that reduces material usage and the average time by one hour, a reasonable benefit/cost ratio will result, as illustrated in the sketch below. Data from the friction meter can be used to monitor and adjust snow and ice control activities and to inform travelers of pavement surface conditions. Therefore, final selection of successfully performing technologies will be based on the foundation statements and criteria developed by the study team.
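A back-of-the-envelope version of that benefit/cost argument, using the per-hour cost quoted above, is sketched here. The number of storms per season and the annualized equipment cost are hypothetical assumptions, not figures from the study.

```python
# Back-of-the-envelope benefit/cost sketch; only the per-hour cost comes from the report.
cost_per_storm_hour = 65_000         # midpoint of the $60,000-$70,000/hour estimate above
hours_saved_per_storm = 1            # the one-hour reduction discussed above
storms_per_season = 25               # hypothetical number of winter storm events
equipment_cost_per_season = 150_000  # hypothetical annualized cost of the concept vehicle

benefit = cost_per_storm_hour * hours_saved_per_storm * storms_per_season
ratio = benefit / equipment_cost_per_season
print(f"Seasonal benefit: ${benefit:,}  benefit/cost ratio: {ratio:.1f}")
```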
Abstract:
This report describes the results of the research project investigating the use of advanced field data acquisition technologies for Iowa transportation agencies. The objectives of the research project were to (1) research and evaluate current data acquisition technologies for field data collection, manipulation, and reporting; (2) identify the current field data collection approach and the interest level in applying current technologies within Iowa transportation agencies; and (3) summarize findings, prioritize technology needs, and provide recommendations regarding suitable applications for future development. A steering committee consisting of state, city, and county transportation officials provided guidance during this project. Technologies considered in this study included (1) data storage (bar coding, radio frequency identification, touch buttons, magnetic stripes, and video logging); (2) data recognition (voice recognition and optical character recognition); (3) field referencing systems (global positioning systems [GPS] and geographic information systems [GIS]); (4) data transmission (radio frequency data communications and electronic data interchange); and (5) portable computers (pen-based computers). The literature review revealed that many of these technologies could have useful applications in the transportation industry. A survey was developed to examine current data collection methods and identify the interest in using advanced field data collection technologies. Surveys were sent to county and city engineers and to state representatives responsible for certain programs (e.g., maintenance management and construction management). Results showed that almost all field data are collected using manual approaches and are hand-carried to the office, where they are either entered into a computer or stored manually. A lack of standardization was apparent in the type of software applications used by each agency; even the types of forms used to manually collect data differed by agency. Furthermore, interest in using advanced field data collection technologies depended upon the technology, the program (e.g., pavement or sign management), and the agency type (e.g., state, city, or county). The state and larger cities and counties seemed to be interested in using several of the technologies, whereas smaller agencies appeared to have very little interest in using advanced techniques to capture data. A more thorough analysis of the survey results is provided in the report. Recommendations are made to enhance the use of advanced field data acquisition technologies in Iowa transportation agencies: (1) Appoint a statewide task group to coordinate the effort to automate field data collection and reporting within the Iowa transportation agencies. Subgroups representing the cities, counties, and state should be formed with oversight provided by the statewide task group. (2) Educate employees so that they become familiar with the various field data acquisition technologies.
Abstract:
Noting that training teachers in information and communication technologies (ICT) raises specific problems, this article aims to open a reflection on this theme. Our reflections are based on a training experience carried out in 23 Swiss vocational schools taking part in a national programme supporting the pedagogical use of ICT. The article describes how the planned training scheme was put into practice, the difficulties it gave rise to, and the responses the actors developed to cope with these difficulties. We show that analysing how a training scheme actually unfolds, and the transformations it undergoes, is an essential element for guiding future training actions and for thinking about the place of learners in a training scheme.
Abstract:
The primary objective of this research was to demonstrate the benefits of NDT technologies for effectively detecting and characterizing deterioration in bridge decks. In particular, the objectives were to demonstrate the capabilities of ground-penetrating radar (GPR) and impact echo (IE), and to evaluate and describe the condition of nine bridge decks proposed by the Iowa DOT. The first part of the report provides a detailed review of the most important deterioration processes in concrete decks, followed by a discussion of the five NDT technologies utilized in this project. In addition to the GPR and IE methods, three other technologies were utilized, namely half-cell (HC) potential, electrical resistivity (ER), and the ultrasonic surface waves (USW) method. The review includes a description of the principles of operation, field implementation, data analysis, and interpretation; information regarding the advantages and limitations of these methods in bridge deck evaluation and condition monitoring is also provided. The second part of the report provides descriptions and bridge deck evaluation results from the nine bridges. The results of the NDT surveys are described in terms of condition assessment maps and are compared with the observations obtained from the recovered cores or from the conducted bridge deck rehabilitation. Results from this study confirm that the technologies used can provide detailed and accurate information about a certain type of deterioration, electrochemical environment, or defect. However, they also show that, at this stage, a comprehensive condition assessment of bridge decks can be achieved only through the complementary use of multiple technologies. Recommendations are provided for the optimum implementation of NDT technologies for the condition assessment and monitoring of bridge decks.
Abstract:
OBJECTIVE: To develop and compare two new technologies for diagnosing a contiguous gene syndrome, the Williams-Beuren syndrome (WBS). METHODS: The first proposed method, named paralogous sequence quantification (PSQ), is based on the use of paralogous sequences located on different chromosomes and quantification of specific mismatches present at these loci using pyrosequencing technology. The second exploits quantitative real-time polymerase chain reaction (QPCR) to assess the relative quantity of an analysed locus. RESULTS: A correct and unambiguous diagnosis was obtained for 100% of the analysed samples with either technique (n = 165 and n = 155, respectively). These methods allowed the identification of two patients with atypical deletions in a cohort of 182 WBS patients. Both patients presented with mild facial anomalies, mild mental retardation with impaired visuospatial cognition, supravalvar aortic stenosis, and normal growth indices. These observations are consistent with the involvement of GTF2IRD1 or GTF2I in some of the WBS facial features. CONCLUSIONS: Both PSQ and QPCR are robust, easy to interpret, and simple to set up. They represent a competitive alternative for the diagnosis of segmental aneuploidies in clinical laboratories. They have advantages over fluorescence in situ hybridisation or microsatellite/SNP genotyping for detecting short segmental aneuploidies, as the former is costly and labour intensive while the latter depends on the informativeness of the polymorphisms.
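The abstract does not detail the quantification model behind the QPCR assay; a common way to turn such relative-quantity measurements into a deletion call is the comparative Ct (2^-ΔΔCt) ratio sketched below, where a value near 0.5 suggests a hemizygous deletion. This is a generic illustration with made-up Ct values, not the authors' protocol.

```python
# Illustrative 2^-ddCt relative copy-number estimate; Ct values are made up.
def relative_copy_number(ct_test_patient, ct_ref_patient,
                         ct_test_control, ct_ref_control):
    """Ratio of the test locus to a reference locus, normalized to a control sample."""
    delta_patient = ct_test_patient - ct_ref_patient
    delta_control = ct_test_control - ct_ref_control
    return 2 ** -(delta_patient - delta_control)

ratio = relative_copy_number(ct_test_patient=26.1, ct_ref_patient=25.0,
                             ct_test_control=25.2, ct_ref_control=25.1)
print(f"Relative copy number: {ratio:.2f}  (~0.5 suggests a hemizygous deletion)")
```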
Abstract:
The identity [r]evolution is happening. Who are you, who am I in the information society? In recent years, the convergence of several factors (technological, political, economic) has accelerated a fundamental change in our networked world. On a technological level, information becomes easier to gather, store, exchange, and process. The belief that more information brings more security has been a strong political driver for information gathering since September 11. Profiling intends to transform information into knowledge in order to anticipate one's behaviour, needs, or preferences. It can lead to categorizations according to specific risk criteria, for example, or to direct and personalized marketing. As a consequence, new forms of identities appear. They are no longer necessarily related to our names. They are based on information, on traces that we leave when we act or interact, when we go somewhere or just stay in one place, or even sometimes when we make a choice. They are related to the SIM cards of our mobile phones, to our credit card numbers, to the pseudonyms that we use on the Internet, to our email addresses, to the IP addresses of our computers, to our profiles... Like traditional identities, these new forms of identities can allow us to distinguish an individual within a group of people, or describe this person as belonging to a community or a category. How far have we moved through this process? The identity [r]evolution is already becoming part of our daily lives. People are eager to share information with their "friends" in social networks like Facebook, in chat rooms, or in Second Life. Customers take advantage of the numerous bonus cards that are made available. Video surveillance is becoming the rule. In several countries, traditional ID documents are being replaced by biometric passports with RFID technologies. This raises several privacy issues and might even change the perception of the concept of privacy itself, in particular among the younger generation. In the information society, our (partial) identities become the illusory masks that we choose, or that we are assigned, to interact and communicate with each other. Rights, obligations, responsibilities, and even reputation are increasingly associated with these masks. On the one hand, these masks become the key to accessing restricted information and using services. On the other hand, in case of fraud or negative reputation, the owner of such a mask can be penalized: doors remain closed, access to services is denied. Hence the current worrying growth of impersonation, identity theft, and other identity-related crimes. Where is the path of the identity [r]evolution leading us? The booklet gives a glimpse of possible scenarios in the field of identity.