973 results for Network (Re) Organization
Abstract:
Background: Beryllium (Be) is increasingly used worldwide for numerous industrial applications. Occupational exposure to Be may lead to Be sensitization (BeS), a CD4-mediated immune response. BeS may progress to chronic beryllium disease (CBD), a granulomatous lung disorder closely resembling sarcoidosis. The recognition of CBD requires detection of Be exposure through the occupational history, and detection of BeS on blood or BAL lymphocytes. Since methods for CBD detection are not routinely available in Switzerland, we hypothesized that CBD cases are not recognized but are misdiagnosed as sarcoidosis. Objective: To present an ongoing Swiss study screening patients with sarcoidosis in search of Be exposure, BeS, and CBD. Methods: Both a prospective and a retrospective cohort are being studied. In the prospective cohort, the main steps are: 1) recruitment of 100 consecutive patients with newly diagnosed pulmonary sarcoidosis at 2 centers (Lausanne, Bern); 2) screening for possible occupational Be exposure with a self-administered patient questionnaire; 3) a standardized detailed occupational interview and clinical visit by an occupational health specialist; if step 3 is positive, then 4) blood and BAL sampling for detection of BeS by a specifically developed Elispot assay and CFSE flow cytometry, with subsequent comparison to the classical Be lymphocyte proliferation test; if step 4 is positive, then 5) review of medical records and diagnostic revision from sarcoidosis to CBD; and 6) appropriate measures for exposure cessation and case reporting to SUVA as an occupational disease. The retrospective cohort will include 400 patients with previously diagnosed pulmonary sarcoidosis, either treated or untreated, recruited through the SIOLD Registries. Steps 2 to 5 will be performed as above, except that a) the study ends after step 2 if the screening questionnaire does not reveal Be exposure, and b) step 4 is done on a blood sample only (BAL not needed). 
Current status: The self-administered screening questionnaire and tools for the standardized occupational interview have been developed. BeS testing has been implemented and is undergoing validation. Inclusions in the prospective phase have started at both study sites. The retrospective phase is in preparation. Conclusion: The current study status supports the technical feasibility of the project. The prospective phase of this study is funded by the SUVA. The SIOLD Registries are supported by the Swiss Pulmonary League.
Abstract:
We analyse, both theoretically and empirically, the factors that influence the amount of humanitarian aid countries receive when they are struck by natural disasters. Our investigation particularly distinguishes between immediate disaster relief, which helps the survival of victims, and long-term humanitarian aid given towards reconstruction and rehabilitation. The theoretical model is able to make predictions as well as explain some of the peculiarities in the empirical results. The empirical analysis, making use of several useful data sources, shows that both short- and long-term humanitarian aid increase with the number of people killed, the financial loss, and the level of corruption, while GDP per capita has no effect. The number of people affected had no effect on short-term aid, but significantly increased long-term aid. Both types of aid increased if the natural disaster was an earthquake, tsunami, or drought. In addition, short-term aid increases in response to a flood, while long-term aid increases in response to storms.
Abstract:
Starting from the observation that ghosts are strikingly recurrent and prominent figures in late-twentieth-century African diasporic literature, this dissertation proposes to account for this presence by exploring its various functions. It argues that, beyond the poetic function the ghost performs as metaphor, it also does cultural, theoretical and political work that is significant to the African diaspora in its dealings with issues of history, memory and identity. Toni Morrison's Beloved (1987) serves as a guide for introducing the many forms, qualities and significations of the ghost, which are then explored and analyzed in four chapters that look at Fred D'Aguiar's Feeding the Ghosts (1998), Gloria Naylor's Mama Day (1988), Paule Marshall's Praisesong for the Widow (1983) and a selection of novels, short stories and poetry by Michelle Cliff. Moving thematically through these texts, the discussion shifts from history through memory to identity as it examines how the ghost trope allows the writers to revisit sites of trauma; revise historical narratives that are constituted and perpetuated by exclusions and invisibilities; creatively and critically repossess a past marked by violence, dislocation and alienation and reclaim the diasporic culture it contributed to shaping; destabilize and deconstruct the hegemonic, normative categories and boundaries that delimit race or sexuality and envision other, less limited and limiting definitions of identity. These diverse and interrelated concerns are identified and theorized as participating in a project of "re-vision," a critical project that constitutes an epistemological as much as a political gesture. The author-based structure allows for a detailed analysis of the texts and highlights the distinctive shapes the ghost takes and the particular concerns it serves to address in each writer's literary and political project. 
However, using the ghost as a guide into these texts, taken collectively, also throws into relief new connections between them and sheds light on the complex ways in which the interplay of history, memory and identity positions them as products of and contributions to an African diasporic (literary) culture. While it insists on the cultural specificity of African diasporic ghosts, tracing their origins to African cultures and spiritualities, the argument also follows gothic studies' common view that ghosts in literary and cultural productions, like other related figures of the living dead, respond to particular conditions and anxieties. Considering the historical and political context in which the texts under study were produced, the dissertation makes connections between the ghosts in them and African diasporic people's disillusionment with the broken promises of the civil rights movement in the United States and of postcolonial independence in the Caribbean. It reads the texts' theoretical concerns and narrative qualities alongside the contestation of traditional historiography by black and postcolonial studies as well as the broader challenge to conventional notions such as truth, reality, meaning, power or identity by poststructuralism, postcolonialism or queer theory. Drawing on these various theoretical approaches and critical tools to elucidate the ghost's deconstructive power for African diasporic writers' concerns, this work ultimately offers a contribution to "spectrality studies," which is currently emerging as a new field of scholarship in cultural theory.
Abstract:
Functional connectivity in the human brain can be represented as a network using electroencephalography (EEG) signals. These networks, whose node counts can range from tens to hundreds, are characterized by neurobiologically meaningful graph-theory metrics. This study investigates the degree to which various graph metrics depend upon network size. To this end, EEGs from 32 normal subjects were recorded and functional networks of three different sizes were extracted. A state-space-based method was used to calculate cross-correlation matrices between different brain regions. These correlation matrices were used to construct binary adjacency connectomes, which were assessed with regard to a number of graph metrics, such as the clustering coefficient, modularity, efficiency, economic efficiency, and assortativity. We showed that the estimates of these metrics differ significantly depending on network size. Larger networks had higher efficiency, higher assortativity, and lower modularity than smaller networks of the same density. These findings indicate that network size should be considered in any comparison of networks across studies.
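As a rough illustration of the size dependence described above, the following Python sketch computes several of the named metrics with networkx on random binary graphs of matched density. This is not the study's pipeline (which derived connectomes from state-space cross-correlation of EEG); the random graphs, sizes, and density here are purely illustrative.

```python
# Illustrative sketch: comparing graph metrics across binary networks
# of different sizes but the same density. Random graphs stand in for
# the EEG-derived connectomes; all parameters are hypothetical.
import networkx as nx
from networkx.algorithms import community

def graph_metrics(G):
    """Compute a subset of the metrics discussed in the abstract."""
    comms = community.greedy_modularity_communities(G)
    return {
        "clustering":    nx.average_clustering(G),
        "modularity":    community.modularity(G, comms),
        "efficiency":    nx.global_efficiency(G),
        "assortativity": nx.degree_assortativity_coefficient(G),
    }

# Two binary "connectomes" with the same edge density (0.2) but
# different node counts, standing in for small vs. large EEG networks.
small = nx.erdos_renyi_graph(32, 0.2, seed=42)
large = nx.erdos_renyi_graph(128, 0.2, seed=42)

for name, G in [("small", small), ("large", large)]:
    m = graph_metrics(G)
    print(name, {k: round(v, 3) for k, v in m.items()})
```

Comparing the two printed dictionaries shows how metric estimates shift with network size even at matched density, which is the comparison the study formalizes.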
Abstract:
The mature oocysts of six new species of Caryospora are described from the faeces of Brazilian snakes. They are differentiated from other species previously recorded from reptiles largely on the size and shape of the oocyst and sporocyst, the structure of the oocyst wall, and the presence or absence of a polar body. C. paraensis n. sp. and C. carajasensis n. sp. are from the "false coral", Oxyrhopus petola digitalis; C. pseustesi n. sp., from the "egg-eater", Pseustes sulphureus sulphureus; C. epicratesi n. sp., from the "red boa", Epicrates cenchria cenchria; and C. micruri n. sp. and C. constancieae n. sp., from the "coral snake", Micrurus spixii spixii. A re-description is given of C. jararacae Carini, 1939, from the "jararaca", Bothrops atrox, incorporating some additional morphological features.
Abstract:
PURPOSE: To better define outcome and prognostic factors in primary pineal tumors. MATERIALS AND METHODS: Thirty-five consecutive patients from seven academic centers of the Rare Cancer Network, diagnosed between 1988 and 2006, were included. Median age was 36 years. Surgical management consisted of biopsy in 12 cases and resection in 21 (unknown in 2 cases). All patients underwent radiotherapy, and 12 patients also received chemotherapy. RESULTS: Histological subtypes were pineoblastoma (PNB) in 21 patients, pineocytoma (PC) in 8 patients, and pineocytoma with intermediate differentiation in 6 patients. Six patients with PNB had evidence of spinal seeding. Fifteen patients relapsed (14 PNB and 1 PC), with PNB cases at higher risk (p = 0.031). Median survival time was not reached. Median disease-free survival was 82 months (50% CI 28-275). In univariate analysis, age younger than 36 years was an unfavorable prognostic factor (p = 0.003). Patients with metastases at diagnosis had poorer survival (p = 0.048). Late side effects related to radiotherapy were dementia, leukoencephalopathy or memory loss in seven cases, occipital ischemia in one, and grade 3 seizures in two cases. Side effects related to chemotherapy were grade 3-4 leucopenia in five cases, grade 4 thrombocytopenia in three cases, grade 2 anemia in two cases, grade 4 pancytopenia in one case, grade 4 vomiting in one case, and renal failure in one case. CONCLUSIONS: Age and dissemination at diagnosis influenced survival in our series. The prevalence of chronic toxicity suggests that new adjuvant strategies are advisable.
Abstract:
The human auditory system comprises specialized but interacting anatomic and functional pathways encoding object, spatial, and temporal information. We review how learning-induced plasticity manifests along these pathways and to what extent there are common mechanisms subserving such plasticity. A first series of experiments establishes a temporal hierarchy along which sounds of objects are discriminated along basic to fine-grained categorical boundaries and learned representations. A widespread network of temporal and (pre)frontal brain regions contributes to object discrimination via recursive processing. Learning-induced plasticity typically manifests as repetition suppression within a common set of brain regions. A second series considers how the temporal sequence of sound sources is represented. We show that lateralized responsiveness during the initial encoding phase of pairs of auditory spatial stimuli is critical for their accurate ordered perception. Finally, we consider how spatial representations are formed and modified through training-induced learning. A population-based model of spatial processing is supported, wherein temporal and parietal structures interact in the encoding of relative and absolute spatial information over the initial ~300 ms post-stimulus onset. Collectively, these data provide insights into the functional organization of human audition and open directions for new developments in targeted diagnostic and neurorehabilitation strategies.
Abstract:
Both experimental and clinical data show evidence of a correlation between elevated blood levels of carcinoembryonic antigen (CEA) and the development of liver metastases from colorectal carcinomas. However, a cause-effect relationship between these two observations has not been demonstrated. For this reason, we developed a new experimental model to evaluate the possible role of circulating CEA in the facilitation of liver metastases. A CEA-negative subclone from the human colon carcinoma cell line CO115 was transfected either with CEA-cDNA truncated at its 3' end by the deletion of 78 base pairs, leading to the synthesis of a secreted form of CEA, or with a full-length CEA-cDNA, leading to the synthesis of the entire CEA molecule linked to the cell surface by a GPI anchor. Transfectants were selected either for their high CEA secretion (clone CO115-2C2, secreting up to 13 µg CEA per 10^6 cells within 72 h) or for their high CEA membrane expression (clone CO115-5F12, expressing up to 1 × 10^6 CEA molecules per cell). When grafted subcutaneously, CO115-2C2 cells gave rise to circulating CEA levels that were directly related to the tumour volume (from 100 to 1000 ng/ml for tumours ranging from 100 to 1000 mm^3), whereas no circulating CEA was detectable in CO115 and CO115-5F12 tumour-bearing mice. Three series of nude mice bearing a subcutaneous xenograft from either clone CO115-2C2 or the CO115-5F12 transfectant, or an untransfected CO115 xenograft, were further challenged for induction of experimental liver metastases by intrasplenic injection of three different CEA-expressing human colorectal carcinoma cell lines (LoVo, LS174T or CO112). The number and size of the liver metastases were shown to be independent of the circulating CEA levels induced by the subcutaneous CEA-secreting clone (CO115-2C2), but they were directly related to the metastatic properties of the intrasplenically injected tumour cells.
Abstract:
Résumé: The globalization of markets, changes in the economic context, and the impact of new information technologies have forced companies to rethink how they manage their intellectual capital (knowledge management) and human capital (competence management). It is now commonly accepted that these assets play a particularly strategic role in the organization. A company wishing to adopt a policy for managing them will face various problems. Indeed, in order to manage this knowledge and these competences, a long capitalization process must be carried out, passing through stages such as the identification, extraction, and representation of knowledge and competences. Various knowledge and competence management methods exist for this purpose, such as MASK, CommonKADS, and KOD. Unfortunately, these methods are cumbersome to implement, are confined to certain types of knowledge, and are consequently limited in the functionality they can offer. Finally, competence management and knowledge management are treated as two separate domains, whereas it would be worthwhile to unify the two approaches into one. Indeed, competences are very close to knowledge, as the following definition of competence underlines: "a set of knowledge in action in a given context." We therefore chose to base our proposal on the concept of competence. Competence is among a company's most crucial forms of knowledge, in particular for avoiding the loss of know-how and for anticipating the company's future needs, because behind employees' competences lies the organization's effectiveness. 
Moreover, competence makes it possible to describe many other organizational concepts, such as jobs, missions, projects, and training. Unfortunately, there is no real consensus on the definition of competence, and the various existing definitions, even when fully satisfactory to experts, do not make it possible to build an operational system. In our approach, we address competence management by means of a knowledge management method. Indeed, by their very nature, knowledge and competence are intimately linked, and such a method is therefore well suited to competence management. In order to exploit this knowledge and these competences, we first had to define the organizational concepts in a clear and computational way. On this basis, we propose a methodology for building the various company repositories (competence, mission, and job repositories, among others). To model these repositories we chose ontologies, because they provide coherent and consensual definitions of the concepts while supporting linguistic diversity. We then map the company's knowledge (training, missions, jobs, and so on) onto these ontologies so that it can be exploited and disseminated. Our approach to knowledge and competence management led to the development of a tool offering many functions, such as mobility-area management, strategic analysis, directories, and CV management. 
Abstract: The globalization of markets, the easing of economic regulation, and finally the impact of new information and communication technologies have obliged firms to re-examine the way they manage their knowledge capital (knowledge management) and their human capital (competence management). 
It is commonly accepted that knowledge plays a particularly strategic role in the organization. Firms that want to establish a policy for managing these assets will face various problems. To manage this knowledge, a long capitalization process must be carried out, involving steps such as the identification, extraction, and representation of knowledge and competences. Several knowledge management methods exist, such as MASK, CommonKADS, and KOD. Unfortunately, these methods are difficult to implement, cover only certain types of knowledge, and are consequently limited in the functionality they can offer. Knowledge management and competence management are two separate domains that it would be worthwhile to unify. Indeed, competence is very close to knowledge, as this definition underlines: "a set of knowledge in action in a specified context." In our approach, we chose to rely on the concept of competence. Competence is one of a company's most crucial forms of knowledge, particularly for avoiding the loss of know-how and for anticipating future needs, because behind employees' competences lies the company's efficiency. Unfortunately, there is no real consensus on the definition of competence, and the various existing definitions do not make it possible to develop an operational system. Competence can also describe other key concepts such as jobs, missions, projects, and training. We approach the problems of competence management from the angle of knowledge management, since knowledge and competence are closely linked. We then propose a method for building the various company repositories (competence, job, and project repositories). To model these repositories we chose ontologies, because they provide coherent and consensual definitions of the concepts while also supporting linguistic diversity. 
This repository-building method, coupled with our knowledge and competence management approach, enabled the development of a tool offering functions such as mobility management, strategic analysis, yellow pages, and CV management.
Abstract:
Research project carried out by a secondary-school student and awarded a CIRIT Prize for fostering the scientific spirit among young people in 2009. The aim of this research project is to build a device that centralizes all the multimedia needs of a home and distributes this content to every terminal on the local network in a simple, automated way. The device is designed to be connected to a high-definition television, allowing all our multimedia to be played and organized comfortably and easily. The media center manages our film, photo, music, and TV-series libraries transparently and automatically. In addition, the user can access all the multimedia stored on the media center from any device on the local network through protocols such as CIFS or UPnP, in an attempt to replicate cloud computing on a local scale. The device has been designed to support all kinds of formats and subtitles, ensuring full compatibility with DRM-free files. Its minimalist, silent design makes it a perfect replacement for the living-room DVD player, and its power consumption is low, around 75% less than that of a conventional PC.
Abstract:
The work carried out in this project is based on the implementation of a wireless demonstrator and, more specifically, on the study of two techniques: network coding and virtualization. Network coding is a new data-transmission method based on encoding packets to increase throughput beyond what conventional transmission methods achieve. Virtualization is a technique for sharing a system's resources more efficiently; in our case, virtualization is used to split a wireless interface into several virtual users transmitting and receiving data simultaneously. The aim of the project is to carry out a series of tests and studies to assess the advantages of these two techniques.
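The packet-coding idea behind network coding can be illustrated with the classic two-way relay example, where a relay XORs two packets and broadcasts a single coded packet instead of forwarding two. This Python sketch is a generic illustration of that principle, not the demonstrator described above; payloads and names are hypothetical.

```python
# Minimal sketch of XOR network coding (two-way relay example).
# All payloads are hypothetical; real systems also handle padding,
# headers, and scheduling, which are omitted here.

def xor_packets(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length packets byte by byte."""
    assert len(a) == len(b), "packets must have equal length (pad if needed)"
    return bytes(x ^ y for x, y in zip(a, b))

# Alice and Bob each send one packet to the relay.
pkt_a = b"HELLO_FROM_A_"
pkt_b = b"REPLY_FROM_B_"

# Instead of forwarding both packets, the relay broadcasts ONE coded packet.
coded = xor_packets(pkt_a, pkt_b)

# Each side recovers the other's packet by XOR-ing with its own copy.
recovered_by_alice = xor_packets(coded, pkt_a)  # yields Bob's packet
recovered_by_bob = xor_packets(coded, pkt_b)    # yields Alice's packet

print(recovered_by_alice)  # b'REPLY_FROM_B_'
print(recovered_by_bob)    # b'HELLO_FROM_A_'
```

The throughput gain comes from the relay's single broadcast replacing two separate transmissions, which is the effect a wireless demonstrator of this kind would measure.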