865 results for Privacy Based Access Control
Abstract:
OBJECTIVE: To identify specific major congenital malformations associated with use of carbamazepine in the first trimester of pregnancy. DESIGN: A review of all published cohort studies to identify key indications and a population based case-control study to test these indications. SETTING: Review of PubMed, Web of Science, and Embase for papers about carbamazepine exposure in the first trimester of pregnancy and specific malformations, and the EUROCAT Antiepileptic Study Database, including data from 19 European population based congenital anomaly registries, 1995-2005. PARTICIPANTS: The literature review covered eight cohort studies of 2680 pregnancies with carbamazepine monotherapy exposure, and the EUROCAT dataset included 98 075 registrations of malformations covering over 3.8 million births. MAIN OUTCOME MEASURES: Overall prevalence for a major congenital malformation after exposure to carbamazepine monotherapy in the first trimester. Odds ratios for malformations with exposure to carbamazepine among cases (five types of malformation identified in the literature review) compared with two groups of controls: other non-chromosomal registrations of malformations and chromosomal syndromes. RESULTS: The literature review yielded an overall prevalence for a major congenital malformation of 3.3% (95% confidence interval 2.7 to 4.2) after exposure to carbamazepine monotherapy in the first trimester. In 131 registrations of malformations, the fetus had been exposed to carbamazepine monotherapy. Spina bifida was the only specific major congenital malformation significantly associated with exposure to carbamazepine monotherapy (odds ratio 2.6 (95% confidence interval 1.2 to 5.3) compared with no antiepileptic drug), but the risk was smaller for carbamazepine than for valproic acid (0.2, 0.1 to 0.6). There was no evidence for an association with total anomalous pulmonary venous return (no cases with carbamazepine exposure), cleft lip (with or without palate) (0.2, 0.0 to 1.3), diaphragmatic hernia (0.9, 0.1 to 6.6), or hypospadias (0.7, 0.3 to 1.6) compared with no exposure to antiepileptic drugs. Further exploratory analysis suggested a higher risk of single ventricle and atrioventricular septal defect. CONCLUSION: Carbamazepine teratogenicity is relatively specific to spina bifida, though the risk is less than with valproic acid. Despite the large dataset, there was not enough power to detect moderate risks for some rare major congenital malformations.
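As a quick illustration of the odds ratios reported above, here is a minimal sketch of the standard 2x2-table computation with a Wald 95% confidence interval; the counts are invented for illustration and are not the EUROCAT data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the study's data)
print(odds_ratio_ci(12, 119, 50, 1300))
```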
Abstract:
Supported by IEEE 802.15.4 standardization activities, embedded networks have been gaining popularity in recent years. The focus of this paper is to quantify the behavior of key networking metrics of IEEE 802.15.4 beacon-enabled nodes under typical operating conditions, with the inclusion of packet retransmissions. We corrected and extended previous analyses by scrutinizing the assumptions on which the prevalent Markovian modeling is generally based. By means of a comparative study, we singled out which of the assumptions impact each of the performance metrics (throughput, delay, power consumption, collision probability, and packet-discard probability). In particular, we showed that - unlike what is usually assumed - the probability that a node senses the channel busy is not constant across the stages of the backoff procedure, and that these differences have a noticeable impact on backoff delay, packet-discard probability, and power consumption. Similarly, we showed that - again contrary to common assumption - the probability of obtaining transmission access to the channel depends on the number of nodes that are simultaneously sensing it. We demonstrated that ignoring this dependence has a significant impact on the calculated values of throughput and collision probability. Circumventing these and other assumptions, we rigorously characterized, through a semianalytical approach, the key metrics of a beacon-enabled IEEE 802.15.4 system with retransmissions.
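To make the per-stage busy-probability effect discussed above concrete, here is a deliberately simplified Monte Carlo sketch of slotted CSMA/CA backoff (a single CCA instead of the standard's two, saturated nodes, collisions and acknowledgements ignored); all parameter values are illustrative, not the paper's.

```python
import random
from collections import defaultdict

# Illustrative parameters (not from the paper)
N_NODES, MIN_BE, MAX_BE, MAX_STAGES = 10, 3, 5, 4
TX_SLOTS, SIM_SLOTS = 6, 200_000

busy = defaultdict(lambda: [0, 0])   # backoff stage -> [busy CCAs, total CCAs]
nodes = [{"stage": 0, "delay": random.randrange(2 ** MIN_BE), "tx": 0}
         for _ in range(N_NODES)]

for _ in range(SIM_SLOTS):
    channel_busy = any(n["tx"] > 0 for n in nodes)
    for n in nodes:
        if n["tx"] > 0:                    # node is transmitting this slot
            n["tx"] -= 1
        elif n["delay"] > 0:               # still counting down its backoff
            n["delay"] -= 1
        else:                              # backoff expired: perform the CCA
            busy[n["stage"]][1] += 1
            if channel_busy:
                busy[n["stage"]][0] += 1
                n["stage"] = min(n["stage"] + 1, MAX_STAGES)
                n["delay"] = random.randrange(2 ** min(MIN_BE + n["stage"], MAX_BE))
            else:                          # channel idle: start transmitting
                n["tx"] = TX_SLOTS
                n["stage"], n["delay"] = 0, random.randrange(2 ** MIN_BE)

for stage, (b, total) in sorted(busy.items()):
    print(f"stage {stage}: P(busy) ~ {b / total:.3f}  ({total} CCAs)")
```

Even this toy model shows the busy probability drifting across stages, which is the dependence the paper argues cannot be assumed away.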
Abstract:
The Summit Lake Watershed Improvement Project is a watershed-based sediment control project designed to greatly reduce, and nearly eliminate, sedimentation of an existing lake that is being renovated for use as a water source in southern Iowa. Summit Lake is owned by the City of Creston and served as a water source until around 1984. The watershed improvements will include lakeshore stabilization and erosion control practices as a precursor to related improvements to the lake and the overall 4,900-acre watershed. Best practices included in this phase are the implementation of riprap, a rain garden, grade stabilization structures, grassed waterways, terraces, basins, water use and access ordinances, education and outreach, water monitoring, and other stream bank improvements. These improvements, along with leveraged work to be done by strategic partners, will sustain the lake for many years to come and enable it to be used for local and regional water supplies. Without the rehabilitation, the lake will likely fill with sediment to the point that it has no recreational value. Key partners are the City of Creston, IDNR, Southern Iowa Rural Water Association, Union County, the Union County NRCS office, Southwestern Community College, and the Summit Lake Association, a non-profit group of landowners working to protect the lake. The project will address the WIRB targets of: a) streambank stabilization, b) livestock runoff, c) agricultural runoff and drainage, d) stormwater runoff, and e) an inadequately sewered section of the community.
Abstract:
This paper presents a control strategy for blood glucose (BG) level regulation in type 1 diabetic patients. To design the controller, a model-based predictive control scheme has been applied to a newly developed diabetic patient model. The controller is provided with a feedforward loop to improve meal compensation, a gain-scheduling scheme to account for different BG levels, and an asymmetric cost function to reduce hypoglycemic risk. A simulation environment that has been approved for testing of artificial pancreas control algorithms has been used to test the controller. The simulation results show good controller performance in fasting conditions and meal disturbance rejection, as well as robustness against model-patient mismatch and errors in meal estimation.
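A minimal sketch of the asymmetric-cost idea described above, with a toy linear glucose model and a brute-force one-horizon search standing in for a real MPC solver; the set point, weights, and model gains are invented and unrelated to the paper's patient model.

```python
import numpy as np

SETPOINT = 110.0            # mg/dL (assumed target)
W_HYPO, W_HYPER = 10.0, 1.0 # penalize below-target BG more (hypoglycemia risk)

def asymmetric_cost(bg_trajectory):
    err = np.asarray(bg_trajectory) - SETPOINT
    return np.sum(np.where(err < 0, W_HYPO, W_HYPER) * err ** 2)

def toy_model(bg, insulin, horizon=6):
    """Toy dynamics: BG drifts up, insulin pulls it down each step."""
    traj = []
    for _ in range(horizon):
        bg = bg - 2.5 * insulin + 1.0
        traj.append(bg)
    return traj

# Brute-force 'MPC': pick the constant insulin rate minimizing the cost
candidates = np.linspace(0.0, 3.0, 61)
best = min(candidates, key=lambda u: asymmetric_cost(toy_model(150.0, u)))
print(f"chosen insulin rate: {best:.2f} U/step")
```

The asymmetry makes the optimizer prefer staying slightly above target over overshooting below it, which is the stated design intent.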
Abstract:
Line converters have become an attractive AC/DC power conversion solution in industrial applications. Line converters are based on controllable semiconductor switches, typically insulated gate bipolar transistors. Compared to traditional diode bridge-based power converters, line converters have many advantageous characteristics, including bidirectional power flow, a controllable dc-link voltage and power factor, and sinusoidal line current. This thesis considers the control of the line converter and its application to power quality improvement. The line converter control system studied is based on virtual flux linkage orientation and the direct torque control (DTC) principle. A new DTC-based current control scheme is introduced and analyzed. The overmodulation characteristics of the DTC converter are considered and an analytical equation for the maximum modulation index is derived. The integration of active filtering features into the line converter is considered. Three different active filtering methods are implemented. A frequency-domain method, which is based on selective harmonic sequence elimination, and a time-domain method, which is effective over a wider frequency band, are used in harmonic current compensation. Also, a voltage feedback active filtering method, which mitigates harmonic sequences of the grid voltage, is implemented. The frequency-domain and voltage feedback active filtering control systems are analyzed and controllers are designed. The designs are verified with practical measurements. The performance and characteristics of the implemented active filtering methods are compared, and the effect of the L- and LCL-type line filters is discussed. The importance of a correct grid impedance estimate in the voltage feedback active filter control system is discussed, and a new measurement-based method to obtain it is proposed. Also, a power conditioning system (PCS) application of the line converter is considered. A new method for correcting the voltage unbalance of the PCS-fed island network is proposed and experimentally validated.
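The frequency-domain compensation idea (isolating selected harmonic sequences of the line current) can be sketched with a single-bin DFT; the signal, sampling rate, and harmonic numbers below are illustrative and do not reproduce the thesis's control structure.

```python
import numpy as np

# Illustrative distorted line current: 50 Hz fundamental plus 5th and 7th
# harmonics, sampled over exactly ten fundamental periods.
F0, FS, N = 50.0, 10_000.0, 2000
t = np.arange(N) / FS
i_line = (10 * np.sin(2 * np.pi * F0 * t)
          + 2 * np.sin(2 * np.pi * 5 * F0 * t)
          + 1 * np.sin(2 * np.pi * 7 * F0 * t))

def harmonic_phasor(signal, h, fs=FS, f0=F0):
    """Single-bin DFT: complex amplitude of harmonic h of f0."""
    n = np.arange(len(signal))
    return 2 / len(signal) * np.sum(signal * np.exp(-2j * np.pi * h * f0 * n / fs))

for h in (5, 7):
    p = harmonic_phasor(i_line, h)
    # Compensation reference: the negated, reconstructed harmonic component
    i_comp = -np.real(p * np.exp(2j * np.pi * h * F0 * t))
    print(f"harmonic {h}: amplitude {abs(p):.2f} A")
```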
Abstract:
Multicast is one method of transferring information in IPv4-based communication; the others are unicast and broadcast. Multicast is based on the group concept, where data is sent from one point to a group of receivers, which saves considerable bandwidth. Group members express an interest in receiving data by using the Internet Group Management Protocol, and traffic is received only by those receivers who want it. The most common multicast applications are media streaming, surveillance, and data collection applications. There are many data security methods to protect unicast communication, which is the most common transfer method on the Internet. Popular data security methods are encryption, authentication, access control, and firewalls. Characteristics of multicast such as dynamic membership mean that not all of these data security mechanisms can be used to protect multicast traffic. At present, multicast traffic can be protected through traffic restrictions, where traffic is allowed to propagate only to certain areas; one way to implement this is packet filters. The methods tested in this thesis are MVR, IGMP Filtering, and access control lists, all of which worked as expected. These methods restrict the propagation of multicast but are laborious to configure at a large scale. There are also a few manufacturer-specific products that make it possible to encrypt multicast traffic, but these separate products are expensive and mainly intended to protect video transmissions via satellite. Multicast security has been investigated for several years, and the resulting security methods are nearing readiness. An IETF working group called MSEC is standardizing these security methods, with the target of standardizing data security protocols for multicast during 2004.
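The group-membership mechanism underlying multicast can be sketched with the standard sockets API: joining an IPv4 group triggers an IGMP membership report, after which routers forward the group's traffic to the host. The group address and port below are arbitrary examples.

```python
import socket
import struct

GROUP, PORT = "239.1.2.3", 5007   # arbitrary example group/port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# IP_ADD_MEMBERSHIP takes the group address plus the local interface;
# 0.0.0.0 lets the OS pick the interface.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

data, sender = sock.recvfrom(1500)   # blocks until group traffic arrives
print(f"received {len(data)} bytes from {sender}")
```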
Abstract:
OBJECTIVE: To quantify the relation between body mass index (BMI) and endometrial cancer risk, and to describe the shape of such a relation. DESIGN: Pooled analysis of three hospital-based case-control studies. SETTING: Italy and Switzerland. POPULATION: A total of 1449 women with endometrial cancer and 3811 controls. METHODS: Multivariate odds ratios (OR) and 95% confidence intervals (95% CI) were obtained from logistic regression models. The shape of the relation was determined using a class of flexible regression models. MAIN OUTCOME MEASURE: The relation of BMI with endometrial cancer. RESULTS: Compared with women with BMI 18.5 to <25 kg/m², the odds ratio was 5.73 (95% CI 4.28-7.68) for women with a BMI ≥35 kg/m². The odds ratios were 1.10 (95% CI 1.09-1.12) and 1.63 (95% CI 1.52-1.75) respectively for an increment of BMI of 1 and 5 units. The relation was stronger in never-users of oral contraceptives (OR 3.35, 95% CI 2.78-4.03, for BMI ≥30 versus <25 kg/m²) than in users (OR 1.22, 95% CI 0.56-2.67), and in women with diabetes (OR 8.10, 95% CI 4.10-16.01, for BMI ≥30 versus <25 kg/m²) than in those without diabetes (OR 2.95, 95% CI 2.44-3.56). The relation was best fitted by a cubic model, although after the exclusion of the 5% upper and lower tails, it was best fitted by a linear model. CONCLUSIONS: The results of this study confirm a role of elevated BMI in the aetiology of endometrial cancer and suggest that the risk in obese women increases in a cubic nonlinear fashion. The relation was stronger in never-users of oral contraceptives and in women with diabetes. TWEETABLE ABSTRACT: Risk of endometrial cancer increases with elevated body weight in a cubic nonlinear fashion.
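The two reported odds-ratio increments can be cross-checked directly: under a log-linear (logistic) model, the OR for a 5-unit BMI increase should be the per-unit OR raised to the fifth power.

```python
# OR 1.10 per 1 BMI unit and OR 1.63 per 5 units are reported above;
# under a log-linear model, OR_5 = OR_1 ** 5.
print(1.10 ** 5)   # ~1.61, consistent with the reported 1.63
```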
Abstract:
Genome-wide linkage studies have identified the 9q22 chromosomal region as linked with colorectal cancer (CRC) predisposition. A candidate gene in this region is transforming growth factor beta receptor 1 (TGFBR1). Investigation of TGFBR1 has focused on the common genetic variant rs11466445, a short exonic deletion of nine base pairs which truncates a stretch of nine alanine residues to six alanine residues in the gene product. While the six-alanine (*6A) allele has been reported to be associated with increased risk of CRC in some population-based study groups, this association remains the subject of robust debate. To date, reports have been limited to population-based case-control association studies, or case-control studies of CRC families selecting one affected individual per family. No study has yet taken advantage of all the genetic information provided by multiplex CRC families. Methods: We have tested for an association between rs11466445 and risk of CRC using several family-based statistical tests in a new study group comprising members of non-syndromic high-risk CRC families sourced from three familial cancer centres, two in Australia and one in Spain. Results: We report a nominally significant result using the pedigree-based association test approach (PBAT; p = 0.028), while other family-based tests were non-significant, though with a p-value < 0.10 in each instance. These other tests included the Generalised Disequilibrium Test (GDT; p = 0.085), the parent-of-origin Generalised Disequilibrium Test (GDT-PO; p = 0.081) and the empirical Family-Based Association Test (FBAT; p = 0.096, additive model). Related-person case-control testing using the 'More Powerful' Quasi-Likelihood Score Test did not provide any evidence for association (M-QLS; p = 0.41). Conclusions: After conservatively taking into account considerations for multiple hypothesis testing, we find little evidence for an association between the TGFBR1*6A allele and CRC risk in these families. The weak support for an increase in risk in CRC-predisposed families is in agreement with recent meta-analyses of case-control studies, which estimate only a modest increase in sporadic CRC risk among *6A allele carriers.
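For readers unfamiliar with family-based testing, the simplest member of this class of methods, a TDT-style transmission count from heterozygous parents, can be sketched as follows; the counts are invented, and this is not the PBAT/GDT/FBAT machinery actually used in the study.

```python
import math

# Heterozygous parents transmit either the *6A or the alternative allele;
# under no association the two counts should be equal (McNemar-type test).
transmitted_6A, untransmitted_6A = 58, 42   # invented counts

chi2 = (transmitted_6A - untransmitted_6A) ** 2 / (transmitted_6A + untransmitted_6A)
p = math.erfc(math.sqrt(chi2 / 2))   # survival function of chi-square, 1 df
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```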
Abstract:
Contactless integrated circuit cards are one form of application of radio frequency identification. They are used in applications such as access control, identification, and payment in public transport. Contactless IC cards are passive, which means that both the data and the energy are transferred to the card without contact, using inductive coupling. Antenna design for contactless IC cards defined by ISO/IEC 14443, and optimization of that design, are studied. The basic operating principles of a contactless system are presented and the structure of a contactless IC card is illustrated. The structure was divided between the contactless chip and the antenna. The operation of the antenna was covered in depth and the parameters affecting its performance were presented. The different antenna technologies and connection technologies were also reviewed. The antenna design process, with its parameters and design tools, is illustrated, and optimization of the design is studied. To improve the design process, a development target was identified: the implementation of a test application. The optimization of the antenna design was presented based on the optimization criteria defined in this study. A solution for implementing these criteria was found, and the effect of each criterion was determined. To enhance the performance of the antenna, a focus for future study was proposed.
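The central tuning relation behind this antenna design task is the resonance condition f = 1/(2π√(LC)) at the 13.56 MHz carrier used by ISO/IEC 14443. A back-of-the-envelope sketch, with an assumed (illustrative) coil inductance:

```python
import math

F_CARRIER = 13.56e6   # Hz, the ISO/IEC 14443 carrier frequency

def resonant_capacitance(L, f=F_CARRIER):
    """Capacitance (farad) needed to resonate inductance L (henry) at f."""
    return 1.0 / ((2 * math.pi * f) ** 2 * L)

L_coil = 2.5e-6   # 2.5 uH: an assumed, plausible card-coil value
print(f"required C: {resonant_capacitance(L_coil) * 1e12:.1f} pF")   # ~55 pF
```

In practice the chip's input capacitance plus parasitics must land near this value, which is why the coil geometry is one of the main optimization parameters.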
Abstract:
One of the main challenges in Software Engineering is to cope with the transition from an industry based on software as a product to software as a service. The field of Software Engineering should provide the necessary methods and tools to develop and deploy new cost-efficient and scalable digital services. In this thesis, we focus on deployment platforms to ensure cost-efficient scalability of multi-tier web applications and of an on-demand video transcoding service under different types of load conditions. Infrastructure as a Service (IaaS) clouds provide Virtual Machines (VMs) under the pay-per-use business model. Dynamically provisioning VMs on demand allows service providers to cope with fluctuations in the number of service users. However, VM provisioning must be done carefully, because over-provisioning results in increased operational cost, while under-provisioning leads to a subpar service. Therefore, our main focus in this thesis is on cost-efficient VM provisioning for multi-tier web applications and on-demand video transcoding. Moreover, to prevent provisioned VMs from becoming overloaded, we augment VM provisioning with an admission control mechanism. Similarly, to ensure efficient use of provisioned VMs, web applications on under-utilized VMs are consolidated periodically. Thus, the main problem that we address is cost-efficient VM provisioning augmented with server consolidation and admission control on the provisioned VMs. We seek solutions for two types of applications: multi-tier web applications that follow the request-response paradigm, and on-demand video transcoding that is based on video streams with soft real-time constraints. Our first contribution is a cost-efficient VM provisioning approach for multi-tier web applications. The proposed approach comprises two sub-approaches: a reactive VM provisioning approach called ARVUE, and a hybrid reactive-proactive VM provisioning approach called Cost-efficient Resource Allocation for Multiple web applications with Proactive scaling. Our second contribution is a prediction-based VM provisioning approach for on-demand video transcoding in the cloud. Moreover, to prevent virtualized servers from becoming overloaded, the proposed VM provisioning approaches are augmented with admission control approaches. Therefore, our third contribution is a session-based admission control approach for multi-tier web applications called adaptive Admission Control for Virtualized Application Servers. Similarly, the fourth contribution in this thesis is a stream-based admission control and scheduling approach for on-demand video transcoding called Stream-Based Admission Control and Scheduling. Our fifth contribution is a computation and storage trade-off strategy for cost-efficient video transcoding in cloud computing. Finally, the sixth and last contribution is a web application consolidation approach, which uses Ant Colony System to minimize the under-utilization of the virtualized application servers.
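The reactive side of such VM provisioning can be sketched as a simple threshold rule, in the spirit of, but not reproducing, ARVUE: scale out when average utilization crosses an upper threshold, scale in below a lower one. Thresholds, bounds, and the load trace are invented for illustration.

```python
UPPER, LOWER, MIN_VMS, MAX_VMS = 0.75, 0.30, 1, 20   # assumed policy values

def reactive_step(n_vms, total_load):
    utilization = total_load / n_vms
    if utilization > UPPER and n_vms < MAX_VMS:
        return n_vms + 1          # over-utilized: provision one more VM
    if utilization < LOWER and n_vms > MIN_VMS:
        return n_vms - 1          # under-utilized: consolidate / release a VM
    return n_vms

n, trace = 2, [0.8, 1.4, 2.3, 3.1, 2.0, 0.9, 0.4]     # toy load trace
for load in trace:
    n = reactive_step(n, load)
    print(f"load {load:.1f} -> {n} VM(s)")
```

A purely reactive rule like this lags behind load spikes, which is the gap the thesis's hybrid reactive-proactive and prediction-based approaches are meant to close.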
Abstract:
The future of privacy in the information age is a highly debated topic. In particular, new and emerging technologies such as ICTs and cognitive technologies are seen as threats to privacy. This thesis explores images of the future of privacy among non-experts within the time frame from the present until the year 2050. The aims of the study are to conceptualise privacy as a social and dynamic phenomenon, to understand how privacy is conceptualised among citizens and to analyse ideal-typical images of the future of privacy using the causal layered analysis method. The theoretical background of the thesis combines critical futures studies and critical realism, and the empirical material is drawn from three focus group sessions held in spring 2012 as part of the PRACTIS project. From a critical realist perspective, privacy is conceptualised as a social institution which creates and maintains boundaries between normative circles and preserves the social freedom of individuals. Privacy changes when actors with particular interests engage in technology-enabled practices which challenge current privacy norms. The thesis adopts a position of technological realism as opposed to determinism or neutralism. In the empirical part, the focus group participants are divided into four clusters based on differences in privacy conceptions and perceived threats and solutions. The clusters are fundamentalists, pragmatists, individualists and collectivists. Correspondingly, four ideal-typical images of the future are composed: ‘drift to low privacy’, ‘continuity and benign evolution’, ‘privatised privacy and an uncertain future’, and ‘responsible future or moral decline’. The images are analysed using the four layers of causal layered analysis: litany, system, worldview and myth. Each image has its strengths and weaknesses. The individualistic images tend to be fatalistic in character while the collectivistic images are somewhat utopian. In addition, the images have two common weaknesses: lack of recognition of ongoing developments and simplistic conceptions of privacy based on a dichotomy between the individual and society. The thesis argues for a dialectical understanding of futures as present images of the future and as outcomes of real processes and mechanisms. The first steps in promoting desirable futures are the awareness of privacy as a social institution, the awareness of current images of the future, including their assumptions and weaknesses, and an attitude of responsibility where futures are seen as the consequences of present choices.
Abstract:
Nowadays, the car has become the most widely used mode of transport, but unfortunately it comes with a number of problems (accidents, pollution, traffic jams, etc.) that will worsen with the expected increase in the number of private cars; despite the very significant efforts made to reduce it, the number of road deaths remains very high. Vehicular ad hoc networks, called VANETs, which consist of several mobile vehicles communicating without any pre-existing infrastructure, are currently receiving increased attention from manufacturers and researchers as a way to improve road safety and driver assistance. For example, they can warn other motorists that roads are slippery or that an accident has just occurred. In VANETs, broadcast protocols play a very important role relative to unicast messages, because they are designed to deliver important safety messages to all nodes. These broadcast protocols are not reliable and suffer from several problems, namely: (1) the broadcast storm; (2) the hidden node; (3) transmission failure. These problems must be solved in order to provide fast and reliable dissemination. The objective of our research is to solve some of these problems while ensuring the best trade-off between reliability, guaranteed delay, and guaranteed throughput (Quality of Service, QoS). The research work of this thesis focused on the development of a new technique for managing medium access (a transmission management protocol), cluster management, and communication. This protocol integrates a centralized management approach for stable clusters with data transmission. In this technique, time is divided into cycles; each cycle is shared between the service and control channels and divided into two parts. The first part relies on TDMA (Time Division Multiple Access); the second relies on CSMA/CA (Carrier Sense Multiple Access / Collision Avoidance) to manage access to the medium. In addition, our protocol adaptively adjusts the time spent broadcasting safety messages, which improves channel capacity. It is implemented in the MAC (Medium Access Control) layer and centralized in the cluster heads (CHs), which continuously adapt to vehicle dynamics. The use of this centralized protocol thus ensures efficient consumption of time slots for the exact number of active vehicles, including hidden nodes/vehicles; our protocol also guarantees a bounded delay for safety applications to access the communication channel, and it reduces overhead through directed broadcast propagation.
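The cycle structure described above can be sketched as follows: the cluster head assigns one TDMA slot per active vehicle, so the TDMA part adapts to the exact node count, and the remainder of the cycle is left to CSMA/CA contention. Slot durations and vehicle identifiers are invented for illustration, and this is only the scheduling skeleton, not the full protocol.

```python
SLOT_MS, CSMA_MS = 2, 20   # assumed slot length and contention-period length

def build_cycle(cluster_head, active_vehicles):
    """Cluster head assigns one TDMA slot per known active vehicle."""
    schedule = {veh: i * SLOT_MS for i, veh in enumerate(active_vehicles)}
    tdma_ms = len(active_vehicles) * SLOT_MS   # adapts to the exact node count
    return {"head": cluster_head, "tdma_ms": tdma_ms,
            "csma_ms": CSMA_MS, "slots": schedule}

cycle = build_cycle("CH-7", ["v12", "v31", "v44"])
print(cycle)
```

Because the head assigns slots centrally, vehicles it has learned about only indirectly (hidden nodes) still receive collision-free slots, which is the claimed advantage over pure contention.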
Abstract:
The application of computer vision based quality control has been slowly but steadily gaining importance, mainly due to its speed in achieving results and also greatly due to the non-destructive nature of the testing. Besides, in food applications it does not contribute to contamination. However, computer vision applications in quality control require appropriate software for image analysis. Even though computer vision based quality control has several advantages, its application has limitations as to the type of work to be done, particularly so in the food industries. Selective applications, however, can be highly advantageous and very accurate. Computer vision based image analysis could be used in morphometric measurements of fish with the same accuracy as the existing conventional method. The method is non-destructive and non-contaminating, thus providing an advantage in seafood processing. The images could be stored in archives and retrieved at any time for biologists to carry out morphometric studies. Computer vision and subsequent image analysis could be used in measurements of various food products to assess uniformity of size. One product, namely cutlet, and product ingredients, namely coating materials such as bread crumbs and rava, were selected for the study. Computer vision based image analysis was used in measurements of the length, width, and area of cutlets, and the width of coating materials like bread crumbs was also measured. Computer imaging and subsequent image analysis can be used very effectively in quality evaluation of product ingredients in food processing: measurement of the width of coating materials could establish uniformity of particles or the lack of it. Image analysis was also applied to bacteriological work.
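The measurement step described above can be sketched with standard OpenCV contour analysis; the file name is a placeholder, and the pixel-to-millimetre calibration factor is assumed (in practice it would come from imaging a reference scale).

```python
import cv2

PX_PER_MM = 4.0   # assumed calibration from a reference scale

img = cv2.imread("cutlet.png", cv2.IMREAD_GRAYSCALE)          # placeholder file
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

largest = max(contours, key=cv2.contourArea)   # assume one object per image
x, y, w, h = cv2.boundingRect(largest)
area_px = cv2.contourArea(largest)

print(f"length: {max(w, h) / PX_PER_MM:.1f} mm")
print(f"width:  {min(w, h) / PX_PER_MM:.1f} mm")
print(f"area:   {area_px / PX_PER_MM ** 2:.1f} mm^2")
```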
Abstract:
In this work, a generic model for synchronous group work on shared information spaces is developed. Developing this model requires realizing the basic functions for synchronous groupware applications. Besides modeling the data space (data model) and the operational interface (interaction model), mechanisms must be realized for representing the activities of group members in the information space (awareness), as well as for synchronizing concurrent accesses of different users to the data space (concurrency control). The fundamental problem in handling concurrency lies in giving up the isolation property of classical ACID transactions in favor of awareness. The rapid development of mobile communication technologies enables the use of these devices for accessing data on the Internet. Through UMTS and WLAN technologies, mobile devices can be used for applications beyond pure communication. A natural consequence of this development is applications for the collaboration of several users. This work therefore places particular emphasis on the support of mobile devices. User interaction on the shared data space is realized through simple navigation operations with a cursor (finger), with the data space represented by XML documents. Visualization is based on transforming XML documents into other XML-based languages such as HTML or SVG through XSLT stylesheets. Awareness information is derived, similarly to the Focus/Nimbus model, from the users' interactions and from determining which objects are visible to each user. For suitable concurrency control, the notion of a visual transaction is introduced, in which the effects of a transaction can be observed by other users (transactions). Synchronization is based on a locking scheme and on the introduction of the new W-lock and the basic operations readV and writeV. The model (groupware server) is implemented in a prototype. Furthermore, a Java application is implemented both on a desktop PC and on a Pocket PC (iPAQ 3970), demonstrating the usability of this prototype.
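The visualization step described above, transforming the shared XML data space into an XML-based presentation language via XSLT, can be sketched with lxml; the document and stylesheet are toy examples, not the prototype's actual schemas.

```python
from lxml import etree

# Toy shared data space and a toy XSLT stylesheet mapping it to HTML
xml = etree.XML("<room><item owner='anna'>note</item></room>")
xslt = etree.XML("""\
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/room">
    <html><body>
      <xsl:for-each select="item">
        <p><xsl:value-of select="@owner"/>: <xsl:value-of select="."/></p>
      </xsl:for-each>
    </body></html>
  </xsl:template>
</xsl:stylesheet>""")

transform = etree.XSLT(xslt)
print(str(transform(xml)))   # serialized HTML view of the data space
```

Swapping the stylesheet for an SVG-producing one would yield the graphical view the work mentions, without touching the underlying data model.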