846 results for Distributed coding


Relevance:

20.00%

Publisher:

Abstract:

The human immunoglobulin lambda variable locus (IGLV) maps to chromosome 22, band q11.1-q11.2. The 30 functional germline Vλ genes sequenced until now have been subgrouped into 10 families (Vλ1 to Vλ10), and the total number of Vλ genes has been estimated at approximately 70. The locus is formed by three gene clusters (VA, VB and VC) that encompass the variable coding genes (V) responsible for the synthesis of lambda-type Ig light chains, and by the Jλ-Cλ cluster containing the joining segments and the constant genes. Recently the entire variable lambda gene locus was mapped by contig methodology and its one-megabase DNA fully sequenced, locating all the known functional Vλ genes and pseudogenes. We screened a human genomic DNA cosmid library and isolated a clone with a 37-kb insert (cosmid 8.3) encompassing four functional genes (IGLV7S1, IGLV1S1, IGLV1S2 and IGLV5a), a pseudogene (VλA) and a vestigial sequence (vg1), in order to study in detail the positions of the restriction sites surrounding the Vλ genes. We generated a high-resolution restriction map locating 31 restriction sites in 37 kb of the VB cluster, a region rich in functional Vλ genes. This mapping information opens the way for further RFLP studies and sequencing.
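The restriction-mapping step described above amounts to locating enzyme recognition sequences along the cloned insert. As an illustration only, the minimal sketch below finds recognition-site positions in a DNA string; the enzyme choice and the fragment are hypothetical placeholders, not data from the cosmid 8.3 study.

    # Minimal sketch: find all occurrences of a restriction enzyme's
    # recognition sequence in a DNA string. The enzyme and the fragment
    # are illustrative placeholders, not data from the cosmid 8.3 study.

    def find_sites(dna: str, recognition: str) -> list[int]:
        """Return 0-based start positions of every occurrence of `recognition`."""
        positions = []
        start = dna.find(recognition)
        while start != -1:
            positions.append(start)
            start = dna.find(recognition, start + 1)
        return positions

    if __name__ == "__main__":
        insert = "AAGAATTCGGCTGAATTCTTAAGCTT"      # hypothetical fragment
        print(find_sites(insert, "GAATTC"))        # EcoRI sites: [2, 12]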

Relevance:

20.00%

Publisher:

Abstract:

Nephrogenic diabetes insipidus (NDI) is a rare disease characterized by the renal inability to respond properly to arginine vasopressin, caused in affected kindreds by mutations in the vasopressin type 2 receptor (V2R) gene. In most kindreds reported thus far, inheritance follows an X chromosome-linked recessive pattern, although autosomal-dominant and autosomal-recessive modes of inheritance have also been described. Mutations in the V2R gene that modify the receptor structure, resulting in a dysfunctional or nonfunctional receptor, have been demonstrated in affected kindreds, but phenotypically indistinguishable NDI patients with a structurally normal V2R gene have also been reported. In the present study, we analyzed exon 3 of the V2R gene in 20 unrelated individuals by direct sequencing. A C→T alteration in the third position of codon 331 (AGC→AGT), which did not alter the encoded amino acid, was found in nine individuals, including two unrelated patients with NDI. Taken together, these observations emphasize the molecular heterogeneity of a phenotypically homogeneous syndrome.
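The AGC→AGT change is silent because both codons specify serine in the standard genetic code. A minimal sketch illustrating why the substitution leaves the protein unchanged (only a small excerpt of the codon table is shown; this is illustrative, not code from the study):

    # Minimal sketch: show that the AGC -> AGT change at codon 331 is
    # synonymous. Only a small excerpt of the standard genetic code is
    # included here for illustration.

    CODON_TABLE = {
        "AGC": "Ser",
        "AGT": "Ser",
        "AGA": "Arg",
        "AGG": "Arg",
    }

    def is_synonymous(codon_a: str, codon_b: str) -> bool:
        """True if both codons encode the same amino acid."""
        return CODON_TABLE[codon_a] == CODON_TABLE[codon_b]

    print(is_synonymous("AGC", "AGT"))  # True: both encode serine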

Relevance:

20.00%

Publisher:

Abstract:

Distributed storage systems are studied. Interest in such systems has grown considerably owing to the increasing amount of information that needs to be stored in data centers and various kinds of cloud systems. Many solutions exist for storing information across distributed devices, depending on the needs of the system designer. This thesis studies both the design of such storage systems and their fundamental limits. The subjects of interest include heterogeneous distributed storage systems, distributed storage systems with the exact-repair property, and locally repairable codes. For distributed storage systems with either functional or exact repair, capacity results are proved. In the case of locally repairable codes, the minimum distance is studied. Constructions for exact-repairing codes between the minimum-bandwidth regenerating (MBR) and minimum-storage regenerating (MSR) points are given; in many cases these codes exceed the time-sharing line between the extremal points. Other properties of exact-regenerating codes are also studied. For the heterogeneous setup, the main result is that the capacity of such systems is always smaller than or equal to the capacity of a homogeneous system with symmetric repair, average node size, and average repair bandwidth. A randomized construction of a locally repairable code with good minimum distance is given: it is shown that a random linear code of a certain natural type has a good minimum distance with high probability. Other properties of locally repairable codes are also studied.
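For context on the minimum-distance question, the standard Singleton-like bound for locally repairable codes (a known result, not one specific to this thesis) states that an [n, k] code in which every symbol has locality r satisfies

    d \leq n - k - \left\lceil \tfrac{k}{r} \right\rceil + 2,

so locality is traded against minimum distance; taking r = k recovers the classical Singleton bound d \leq n - k + 1.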

Relevance:

20.00%

Publisher:

Abstract:

Measles virus is a highly contagious agent that causes a major health problem in developing countries. The viral genomic RNA is single-stranded, nonsegmented and of negative polarity. Many live attenuated measles vaccines have been developed using either the prototype Edmonston strain or other locally isolated measles strains. Despite the diverse geographic origins of the vaccine viruses and the different attenuation methods used, there is remarkable sequence similarity of the H, F and N genes among all vaccine strains. CAM-70 is a Japanese attenuated measles vaccine strain widely used in Brazilian children and produced by Bio-Manguinhos since 1982. Previous studies have characterized this vaccine biologically and genomically; nevertheless, only the F, H and N genes had been sequenced. In the present study we sequenced the remaining P, M and L genes (approximately 1.6, 1.4 and 6.5 kb, respectively) to complete the genomic characterization of CAM-70 and to assess the extent of the genetic relationship between CAM-70 and other current vaccines. These genes were amplified using long-range or standard RT-PCR techniques, and the cDNA was cloned and automatically sequenced using the dideoxy chain-termination method. Sequence analysis comparing previously sequenced genotype A strains with the CAM-70 Bio-Manguinhos strain showed low divergence among them. However, the CAM-70 strains (CAM-70 Bio-Manguinhos and a recently sequenced CAM-70 submaster seed strain) were assigned to a specific group by phylogenetic analysis using the neighbor-joining method. Information about our product at the genomic level is important for monitoring vaccination campaigns and for future studies of measles virus attenuation.

Relevance:

20.00%

Publisher:

Abstract:

Our objective was to clone, express and characterize the adult Dermatophagoides farinae group 1 (Der f 1) allergen in order to produce recombinant allergens for future clinical applications and thereby eliminate side reactions caused by crude mite extracts. Based on GenBank data, we designed primers and amplified the cDNA fragment coding for Der f 1 by nested PCR. After purification and recovery, the cDNA fragment was cloned into the pMD19-T vector, sequenced, subcloned into the plasmid pET28a(+), expressed in Escherichia coli BL21 and identified by Western blotting. The cDNA coding for Der f 1 was cloned, sequenced and expressed successfully. Sequence analysis showed the presence of an open reading frame of 966 bp encoding a protein of 321 amino acids. Interestingly, homology analysis showed that Der p 1 shared more than 87% amino acid sequence identity with Eur m 1 but only 80% with Der f 1. Furthermore, phylogenetic analyses suggested that D. pteronyssinus is evolutionarily closer to Euroglyphus maynei than to D. farinae, even though D. pteronyssinus and D. farinae belong to the same genus Dermatophagoides. A total of three cysteine peptidase active sites were found in the predicted amino acid sequence: 127-138 (QGGCGSCWAFSG), 267-277 (NYHAVNIVGYG) and 284-303 (YWIVRNSWDTTWGDSGYGYF). Moreover, secondary structure analysis revealed that Der f 1 contains α-helix (33.96%), extended strand (17.13%), β-turn (5.61%) and random coil (43.30%) elements. A simple three-dimensional model of the protein was constructed using the SWISS-MODEL server. In conclusion, the cDNA coding for Der f 1 was cloned, sequenced and expressed successfully, and alignment and phylogenetic analysis suggest that D. pteronyssinus is evolutionarily more similar to E. maynei than to D. farinae.
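Pairwise percent identity of the kind quoted above is computed from an alignment as the fraction of aligned positions at which the residues match. A minimal sketch with made-up, pre-aligned toy sequences (not the actual Der f 1, Der p 1 or Eur m 1 sequences):

    # Minimal sketch: percent identity of two pre-aligned, equal-length
    # sequences. The sequences below are made-up toy examples, not the
    # actual allergen sequences.

    def percent_identity(aligned_a: str, aligned_b: str) -> float:
        """Percentage of aligned positions (ignoring gap-gap columns) that match."""
        pairs = [(a, b) for a, b in zip(aligned_a, aligned_b) if not (a == "-" and b == "-")]
        matches = sum(1 for a, b in pairs if a == b and a != "-")
        return 100.0 * matches / len(pairs)

    print(percent_identity("QGGCGSCWAFSG", "QGGCGSCWAFSA"))  # 11/12 positions match -> ~91.7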

Relevance:

20.00%

Publisher:

Abstract:

With the new age of the Internet of Things (IoT), everyday objects such as mobile smart devices are starting to be equipped with cheap sensors and low-energy wireless communication capabilities. Nowadays mobile smart devices (phones, tablets) have become ubiquitous, with everyone having access to at least one device. There is an opportunity to build innovative applications and services by exploiting these devices' untapped rechargeable energy, sensing and processing capabilities. In this thesis, we propose, develop, implement and evaluate LoadIoT, a peer-to-peer load-balancing scheme that can distribute tasks among a plethora of mobile smart devices in the IoT world. We develop and demonstrate an Android-based proof-of-concept load-balancing application. We also present a model of the system, which is used to validate the efficiency of the load-balancing approach under varying application scenarios. Load-balancing concepts can be applied to smart-device-based IoT scenarios, reducing the traffic sent to the cloud and the energy consumption of the devices. The data acquired from the experimental outcomes enable us to determine the feasibility and cost-effectiveness of load-balanced P2P smartphone-based applications.
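The core idea of such a scheme can be illustrated with a greedy least-loaded assignment of tasks to peers. This is a simplified, hypothetical sketch of that general policy, not the authors' LoadIoT implementation or its Android code:

    # Simplified sketch of a least-loaded task-assignment policy, as one
    # way a peer-to-peer load balancer might spread work across devices.
    # Device names, task costs, and the policy itself are illustrative
    # assumptions, not the LoadIoT implementation.

    from dataclasses import dataclass, field

    @dataclass
    class Device:
        name: str
        capacity: float          # relative processing capacity
        load: float = 0.0        # accumulated work, normalized by capacity
        tasks: list = field(default_factory=list)

    def assign(tasks: dict[str, float], devices: list[Device]) -> None:
        """Assign each task to the device with the lowest normalized load."""
        for task, cost in tasks.items():
            target = min(devices, key=lambda d: d.load)
            target.tasks.append(task)
            target.load += cost / target.capacity

    if __name__ == "__main__":
        peers = [Device("phone-A", 1.0), Device("phone-B", 2.0), Device("tablet-C", 1.5)]
        assign({"sense": 2.0, "filter": 1.0, "aggregate": 3.0, "upload": 1.0}, peers)
        for d in peers:
            print(d.name, d.tasks, round(d.load, 2))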

Relevance:

20.00%

Publisher:

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes, as well as sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of system reconfiguration mechanisms constitute a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature: to ensure functional correctness it should rely on formal modelling and verification, while to assess the impact of changes on such properties as performance and reliability it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature, industrial-strength tool support: the Rodin platform. Proof-based verification, as well as the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, achieving resilience also requires analysing a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating techniques such as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from such areas as robotics, space, healthcare and the cloud domain.
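As an indication of what the quantitative side of such an integration can look like, below is a minimal SimPy discrete-event simulation of a service that occasionally fails and is reconfigured after a detection delay. The failure rate, delay and model structure are illustrative assumptions, not the thesis's actual models:

    # Minimal SimPy sketch: a service that fails at random and is restored
    # after a detection + reconfiguration delay. All rates and delays are
    # illustrative assumptions, not values from the thesis.

    import random
    import simpy

    MTTF = 100.0          # mean time to failure (arbitrary units)
    RECONF_DELAY = 5.0    # detection + reconfiguration time
    SIM_TIME = 10_000.0

    def service(env, stats):
        while True:
            uptime = random.expovariate(1.0 / MTTF)   # time until next failure
            yield env.timeout(uptime)
            stats["up"] += uptime
            yield env.timeout(RECONF_DELAY)           # system down while reconfiguring
            stats["down"] += RECONF_DELAY

    if __name__ == "__main__":
        random.seed(42)
        stats = {"up": 0.0, "down": 0.0}
        env = simpy.Environment()
        env.process(service(env, stats))
        env.run(until=SIM_TIME)
        total = stats["up"] + stats["down"]
        print("estimated availability:", round(stats["up"] / total, 3))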

Relevance:

20.00%

Publisher:

Abstract:

The goal of this thesis is to define and validate a software engineering approach for the development of a distributed system for the modeling of composite materials, based on an analysis of various existing software development methods. We reviewed the main features of (1) software engineering methodologies; (2) distributed system characteristics and their effect on software development; and (3) composite materials modeling activities and the corresponding requirements for software development. Using design science as the research methodology, the distributed system for creating models of composite materials is created and evaluated. The empirical experiments we conducted showed good convergence between modeled and real processes. During the study, attention was paid to the complexity and importance of the distributed system and to a deep understanding of modern software engineering methods and tools.

Relevance:

20.00%

Publisher:

Abstract:

Liberalization of electricity markets has resulted in a competitive Nordic electricity market, in which electricity retailers play a key role as electricity suppliers, market intermediaries, and service providers. Although these roles may remain unchanged in the near future, the retailers’ operation may change fundamentally as a result of the emerging smart grid environment. In particular, the increasing amount of distributed energy resources (DER), and the improving opportunities for their control, are reshaping the operating environment of the retailers. This requires that the retailers’ operation models be developed to match an operating environment in which the active use of DER plays a major role. Electricity retailers have a clientele and operate actively in the electricity markets, which makes them a natural market party to offer new services to end-users aiming at an efficient, market-based use of DER. From the retailer’s point of view, the active use of DER can provide means to adapt the operation to the challenges posed by the smart grid environment and to pursue the retailer’s ultimate objective, which is to maximize the profit of operation. This doctoral dissertation introduces a methodology for the comprehensive use of DER in an electricity retailer’s short-term profit optimization, covering operation in a variety of marketplaces including the day-ahead, intra-day, and reserve markets. The analysis results provide data on the key profit-making opportunities and the risks associated with different types of DER use; the methodology may therefore serve as an efficient tool for an experienced operator in planning the optimal market-based use of DER. The key contributions of this doctoral dissertation lie in the analysis and development of a model that allows the retailer both to benefit from the profit-making opportunities brought by the use of DER in different marketplaces and to manage the major risks involved in the active use of DER. In addition, the dissertation introduces an analysis of the economic potential of DER control actions in different marketplaces, including the day-ahead Elspot market, the balancing power market, and the hourly market of Frequency Containment Reserve for Disturbances (FCR-D).
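To illustrate the kind of decision such an optimization addresses, the toy sketch below allocates a controllable DER block to whichever marketplace prices it highest in each hour. The market names, prices and greedy policy are illustrative placeholders, not the dissertation's optimization model:

    # Toy sketch: allocate a controllable DER block (MWh per hour) to the
    # highest-priced marketplace each hour. Prices, market names and the
    # greedy policy are illustrative placeholders, not the dissertation's
    # short-term profit-optimization model.

    CAPACITY_MWH = 2.0   # controllable DER energy available per hour (assumed)

    # hypothetical hourly prices, EUR/MWh, per marketplace
    prices = {
        "day_ahead": [35.0, 42.0, 55.0],
        "balancing": [30.0, 60.0, 50.0],
        "fcr_d":     [25.0, 20.0, 70.0],
    }

    def allocate(prices: dict[str, list[float]], capacity: float):
        """For each hour, sell the whole block in the best-priced market."""
        hours = len(next(iter(prices.values())))
        plan, revenue = [], 0.0
        for h in range(hours):
            market = max(prices, key=lambda m: prices[m][h])
            plan.append(market)
            revenue += prices[market][h] * capacity
        return plan, revenue

    plan, revenue = allocate(prices, CAPACITY_MWH)
    print(plan)                     # ['day_ahead', 'balancing', 'fcr_d']
    print("revenue EUR:", revenue)  # 330.0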

Relevance:

20.00%

Publisher:

Abstract:

Traditional psychometric theory and practice classify people according to broad ability dimensions but do not examine how these mental processes occur. Hunt and Lansman (1975) proposed a 'distributed memory' model of cognitive processes with emphasis on how to describe individual differences, based on the assumption that each individual possesses the same components; it is in the quality of these components that individual differences arise. Carroll (1974) expands Hunt's model to include a production system (after Newell and Simon, 1973) and a response system. He developed a framework of factor analytic (FA) factors for the purpose of describing how individual differences may arise from them; this scheme is to be used in the analysis of psychometric tests. Recent advances in the field of information processing are examined and include: 1) Hunt's development of differences between subjects designated as high or low verbal; 2) Miller's pursuit of the magic number seven, plus or minus two; 3) Ferguson's examination of transfer and abilities; and 4) Brown's discoveries concerning strategy teaching and retardates. In order to examine possible sources of individual differences arising from cognitive tasks, traditional psychometric tests were searched for a suitable perceptual task which could be varied slightly and administered to gauge learning effects produced by controlling independent variables. It also had to be suitable for analysis using Carroll's framework. The Coding Task (a symbol substitution test) found in the Performance Scale of the WISC was chosen. Two experiments were devised to test the following hypotheses: 1) High verbals should be able to complete significantly more items on the Symbol Substitution Task than low verbals (Hunt and Lansman, 1975). 2) Having previous practice on a task, where the strategies involved in the task may be identified, increases the amount of output on a similar task (Carroll, 1974). 3) There should be a substantial decrease in the amount of output as the load on STM is increased (Miller, 1956). 4) Repeated measures should produce an increase in output over trials, and where individual differences in previously acquired abilities are involved, these should differentiate individuals over trials (Ferguson, 1956). 5) Teaching slow learners a rehearsal strategy would improve their learning such that it would resemble that of normals on the same task (Brown, 1974). In the first experiment 60 subjects were divided into high and low verbal groups, each further divided randomly into a practice group and a non-practice group. Five subjects in each group were assigned randomly to work on a five-, seven- or nine-digit code throughout the experiment. The practice group was given three trials of two minutes each on the practice code (designed to eliminate transfer effects due to symbol similarity) and then three trials of two minutes each on the actual SST task. The non-practice group was given three trials of two minutes each on the same actual SST task. Results were analyzed using a four-way analysis of variance. In the second experiment 18 slow learners were divided randomly into two groups, one group receiving planned strategy practice, the other receiving random practice. Both groups worked on the actual code to be used later in the actual task. Within each group subjects were randomly assigned to work on a five-, seven- or nine-digit code throughout. Both practice and actual tests consisted of three trials of two minutes each.
Results were analyzed using a three-way analysis of variance. In the first experiment it was found that: 1) high or low verbal ability by itself did not produce significantly different results; however, in interaction with the other independent variables, a difference in performance was noted; 2) the previous-practice variable was significant over all segments of the experiment, and those who received previous practice scored significantly higher than those without it; 3) increasing the load on STM severely restricts performance; 4) the effect of repeated trials proved to be beneficial, with gains generally made on each successive trial within each group; and 5) in the second experiment, slow learners who were allowed to practice randomly performed better on the actual task than subjects who were taught the code by means of a planned strategy. Upon analysis using the Carroll scheme, individual differences were noted in the ability to develop strategies for storing, searching and retrieving items from STM, and in adopting the necessary rehearsals for retention in STM. While these strategies may benefit some, it was found that for others they may be harmful. Temporal aspects and perceptual speed were also found to be sources of variance within individuals. Generally it was found that the largest single factor influencing learning on this task was repeated measures. What enables gains to be made varies with individuals: environmental factors, specific abilities, strategy development, previous learning, the load on STM, and perceptual and temporal parameters all influence learning, and these have serious implications for educational programs.
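For readers unfamiliar with the factorial analyses mentioned above, here is a minimal sketch of how a between-subjects factorial ANOVA of this kind (verbal ability x practice x STM load) might be run today. The column names and the tiny data frame are made-up placeholders, not the thesis data:

    # Minimal sketch: a factorial ANOVA of the kind described above,
    # run with statsmodels. The data frame is a tiny, made-up placeholder,
    # not the thesis data (two replicates per cell so residual df > 0).

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    df = pd.DataFrame({
        "verbal":   ["high"] * 8 + ["low"] * 8,
        "practice": (["yes"] * 4 + ["no"] * 4) * 2,
        "load":     [5, 5, 9, 9] * 4,
        "score":    [44, 41, 30, 28, 36, 34, 24, 22,
                     35, 33, 23, 21, 29, 27, 18, 16],  # items completed (made up)
    })

    model = ols("score ~ C(verbal) * C(practice) * C(load)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))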

Relevance:

20.00%

Publisher:

Abstract:

The present set of experiments was designed to investigate the organization and refinement of young children's face space. Past research has demonstrated that adults encode individual faces in reference to a distinct face prototype that represents the average of all faces ever encountered. The prototype is not a static abstracted norm but rather a malleable face average that is continuously updated by experience (Valentine, 1991); for example, following prolonged viewing of faces with compressed features (a technique referred to as adaptation), adults rate similarly distorted faces as more normal and more attractive (simple attractiveness aftereffects). Recent studies have shown that adults possess category-specific face prototypes (e.g., based on race or sex). After viewing faces from two categories (e.g., Caucasian/Chinese) that are distorted in opposite directions, adults' attractiveness ratings simultaneously shift in opposite directions (opposing aftereffects). The current series of studies used a child-friendly method to examine whether, like adults, 5- and 8-year-old children show evidence for category-contingent opposing aftereffects. Participants were shown a computerized storybook in which Caucasian and Chinese children's faces were distorted in opposite directions (expanded and compressed). Both before and after adaptation (i.e., reading the storybook), participants judged the normality/attractiveness of a small number of expanded, compressed, and undistorted Caucasian and Chinese faces. The method was first validated by testing adults (Experiment 1) and was then refined in order to test 8-year-old (Experiment 2) and 5-year-old (Experiment 4a) children. Five-year-olds (our youngest age group) were also tested in a simple aftereffects paradigm (Experiment 3) and with male and female faces distorted in opposite directions (Experiment 4b). The current research is the first to demonstrate evidence for simple attractiveness aftereffects in children as young as 5, thereby indicating that, similar to adults, 5-year-olds utilize norm-based coding. Furthermore, this research provides evidence for race-contingent opposing aftereffects in both 5- and 8-year-olds; however, the opposing aftereffects demonstrated by 5-year-olds were driven largely by simple aftereffects for Caucasian faces. The lack of simple aftereffects for Chinese faces in 5-year-olds may reflect young children's limited experience with other-race faces and suggests that children's face space undergoes a period of increasing differentiation over time with respect to race. Lastly, we found no evidence for sex-contingent opposing aftereffects in 5-year-olds, which suggests that young children do not rely on a fully adult-like face space even for highly salient face categories (i.e., male/female) with which they have comparable levels of experience.

Relevance:

20.00%

Publisher:

Abstract:

Thesis (Master of Science with a specialization in Mathematics), UANL, 2013.

Relevance:

20.00%

Publisher:

Abstract:

Affiliation: Svetlana Shumikhina & Stéphane Molotchnikoff: Département de Sciences Biologiques, Université de Montréal