930 results for Superimposed Codes
Abstract:
Background: Ethics is defined as the entirety of moral principles that form the basis of individuals' behavior; it can also be defined as "moral theory" or "theoretical ethics". Objectives: To determine the knowledge and practices of pediatric nurses related to ethical codes. Patients and Methods: Participants were nurses attending the Neonatal Intensive Care Unit Nursing Course and the Pediatric Nursing Course conducted in Istanbul between September 2011 and December 2012. All nurses attending the courses on the specified dates who agreed to participate in the study were included in the analysis. Data were collected through a questionnaire that we developed in accordance with the current literature on nursing ethics. Results: 140 nurses participated in this study. Nurses' knowledge and practices related to ethical codes fell into four categories (autonomy, beneficence, nonmaleficence, and justice) together with the principle of confidentiality/keeping secrets. Overall, 64.3% of nurses reported having heard of nursing ethical codes. The best-known ethical code was the principle of justice. Furthermore, while the rates were generally low, some nurses engaged in unethical practices such as patient discrimination and prioritizing acquaintances. Conclusions: We conclude that most nurses working in pediatric clinics act in compliance with ethical codes. We also found that the majority of nurses wanted to learn about ethical codes. For this reason, we recommend that nurses working in clinics and future nurses in training be informed of the appropriate ethical behavior and codes.
Abstract:
We propose weakly-constrained stream and block codes with tunable pattern-dependent statistics and demonstrate that the block code capacity at large block sizes is close to the prediction obtained from a simple Markov model published earlier. We demonstrate the feasibility of the code by presenting original encoding and decoding algorithms with a complexity that is log-linear in the block size and with modest table memory requirements. We also show that when such codes are used to mitigate patterning effects in optical fibre communications, a gain of about 0.5 dB is possible under realistic conditions, at the expense of a small redundancy (≈10%). © 2010 IEEE
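The abstract does not reproduce the Markov model itself, but the flavour of such a capacity estimate can be sketched with a transfer-matrix calculation: for a constrained binary code, the achievable rate is log2 of the largest eigenvalue of the constraint's adjacency matrix. The concrete constraint below (no run of three identical bits) and the state labelling are illustrative assumptions, not the scheme of the paper.

```python
import numpy as np

# Toy constraint for illustration: binary sequences with no run of three
# identical bits.  States track the last two bits; an edge exists whenever
# appending a bit does not create a forbidden run.
states = ["00", "01", "10", "11"]
A = np.zeros((4, 4))
for i, s in enumerate(states):
    for bit in "01":
        if not (s[0] == s[1] == bit):      # would create three equal bits in a row
            A[i, states.index(s[1] + bit)] = 1

# Shannon capacity of the constrained system: log2 of the largest eigenvalue
# of the transfer (adjacency) matrix.
capacity = np.log2(max(abs(np.linalg.eigvals(A))))
print(f"capacity ≈ {capacity:.4f} bits/symbol")    # ≈ 0.6942
```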
Abstract:
The perception of urban space, in all its complexity, is inevitably affected by the layering, over time, of historical meanings, ideologies, archetypes and utopias through which society, at its various stages of development, has consolidated the idea of the urban centre. In the contemporary world, the narrated city increasingly overlaps the real one, organizing and synthesizing the interpretive processes of urban circuits: the cityscape, the physical panorama of the city, is preceded by its mindscape, the panorama of the soul and of urban cultures. In line with these perspectives, this research sets out to analyse the communicative processes and media paradigms that traverse and redefine urban dynamics, examining the tools and languages that contribute to drawing and narrating the image of a city. In this context, the project takes as a case study the singular situation of the university district around via Zamboni in Bologna: a street of extraordinary beauty and vitality which, however, does not enjoy an equally positive public image. In particular, the thesis analysed the public image and perception of via Zamboni and piazza Verdi from the early twentieth century to the present day, in relation to the main events that have had them as privileged settings. Considering a time span of more than a century, a number of key moments were selected, cultural occasions or events with a strong symbolic connotation: from the Liberation to the demonstrations of 1977, from the historic premieres of the Teatro Comunale to the exhibitions of the Pinacoteca, from the lectures of renowned university professors to the most recent student protests. The result is a layered account that moves through signs and images to reconstruct the iconography of the district through texts, photographs, films, works of art and multimedia products.
Abstract:
In the framework of a global transition to a low-carbon energy mix, interest in advanced nuclear Small Modular Reactors (SMRs) has been growing at the international level. Given the high level of maturity reached by Severe Accident Codes for currently operating reactors, their applicability to advanced SMRs is starting to be studied. Within the present thesis work, and in the framework of a collaboration between ENEA, UNIBO and IRSN, an ASTEC code model of a generic IRIS reactor has been developed. The simulation of a DBA sequence involving the operation of all the passive safety systems of the generic IRIS has been carried out to investigate the code model's capability to predict the thermal-hydraulics characterizing an integral SMR adopting a passive mitigation strategy. The subsequent simulation of four BDBA sequences explores the applicability of Severe Accident Codes to advanced SMRs in beyond-design and core-degradation conditions. The uncertainty affecting a code simulation can be estimated with the method of Input Uncertainty Propagation, applied here through the RAVEN-ASTEC coupling and its implementation on an HPC platform. This probabilistic methodology has been employed in a study of the uncertainty affecting the passive safety system operation in the DBA simulation of ASTEC, providing a further characterization of the thermal-hydraulics of this sequence. The application of the Uncertainty Quantification method to early core-melt phenomena has been investigated in the framework of a BEPU analysis of the ASTEC simulation of the QUENCH test-6 experiment. A possible solution to the encountered challenges has been proposed through the application of a Limit Surface search algorithm.
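The RAVEN-ASTEC coupling itself is not shown in the abstract; as a rough, generic illustration of input uncertainty propagation, the sketch below samples uncertain input multipliers, runs a placeholder model in place of an ASTEC calculation, and applies Wilks' formula (59 runs for a one-sided 95%/95% tolerance limit) to bound the output. The model function, parameter names and distributions are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_model(params):
    """Placeholder standing in for one code run (e.g. an ASTEC calculation
    launched by RAVEN).  Returns a scalar figure of merit; the formula below
    is invented purely for illustration."""
    htc_multiplier, power_multiplier = params
    return 600.0 + 80.0 * power_multiplier / htc_multiplier

# Wilks' formula: 59 runs suffice for a one-sided 95%/95% tolerance limit,
# whatever the output distribution.
n_runs = 59
samples = np.column_stack([
    rng.normal(1.0, 0.05, n_runs),     # heat-transfer-coefficient multiplier (assumed)
    rng.uniform(0.9, 1.1, n_runs),     # decay-power multiplier (assumed)
])
outputs = np.array([run_model(p) for p in samples])
print("95/95 upper tolerance limit ≈", outputs.max())
```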
Abstract:
In the last few years there has been great progress in technologies such as quantum computers and quantum communication systems, owing to their huge potential and the growing number of applications. However, physical qubits suffer from many nonidealities, such as measurement errors and decoherence, that cause failures in the quantum computation. This work shows how concepts from classical information theory can be exploited to realize quantum error-correcting codes by adding redundant qubits. In particular, the threshold theorem states that the decoding failure rate can be made arbitrarily small, provided the physical error rate is below a given accuracy threshold. The focus will be on codes belonging to the family of topological codes, such as the toric, planar and XZZX surface codes. Firstly, they will be compared from a theoretical point of view, in order to show their advantages and disadvantages. The algorithms behind the minimum-weight perfect matching decoder, the most popular decoder for such codes, will be presented. The last section will be dedicated to the analysis of the performance of these topological codes under different error channel models, showing interesting results. In particular, while the error correction capability of surface codes decreases in the presence of biased errors, XZZX codes possess intrinsic symmetries that allow them to improve their performance when one kind of error occurs more frequently than the others.
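Surface-code decoding is too involved for a snippet, but the threshold behaviour invoked here can be illustrated with the simplest error-correcting construction, a bit-flip repetition code decoded by majority vote. This toy Monte Carlo is only an illustration of error suppression below a threshold, not the surface-code or matching-decoder machinery discussed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def logical_error_rate(p, n_qubits, trials=100_000):
    """Bit-flip repetition code decoded by majority vote: a logical error
    occurs when more than half of the qubits are flipped."""
    flips = rng.random((trials, n_qubits)) < p
    return np.mean(flips.sum(axis=1) > n_qubits // 2)

for p in (0.01, 0.10, 0.30):
    print(p, [round(logical_error_rate(p, n), 4) for n in (3, 5, 7)])
# For physical error rates below this toy code's 50% threshold, adding qubits
# suppresses the logical error rate, which is the idea behind the threshold theorem.
```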
Abstract:
To estimate the impact of aging and diabetes on insulin sensitivity, beta-cell function, adipocytokines, and incretin production. Hyperglycemic clamps, arginine tests and meal tolerance tests were performed in 50 non-obese subjects to measure insulin sensitivity (IS) and insulin secretion as well as plasma levels of glucagon, GLP-1 and GIP. Patients with diabetes and healthy control subjects were divided into the following groups: middle-aged type 2 diabetes (MA-DM), aged type 2 diabetes (A-DM), and middle-aged or aged subjects with normal glucose tolerance (MA-NGT or A-NGT). IS, as determined by the homeostasis model assessment, glucose infusion rate, and oral glucose insulin sensitivity, was reduced in the aged and DM groups compared with MA-NGT, but it was similar in the MA-DM and A-DM groups. The insulinogenic index, first- and second-phase insulin secretion and the disposition indices, but not the insulin response to arginine, were reduced in the aged and DM groups. Postprandial glucagon production was higher in MA-DM compared to MA-NGT. Whereas GLP-1 production was reduced in A-DM, no differences between groups were observed in GIP production. In non-obese subjects, diabetes and aging impair insulin sensitivity. Insulin production is reduced by aging, and diabetes exacerbates this condition. Aging-associated defects are superimposed on diabetic pathophysiology, particularly regarding GLP-1 production. On the other hand, glucose-independent secretion of insulin was preserved. Knowledge of the complex relationship between aging and diabetes could support the development of pathophysiologically and pharmacologically based therapies.
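The homeostasis model assessment mentioned above is usually computed with the standard HOMA-IR formula; the sketch below assumes that standard definition (fasting insulin in µU/mL, fasting glucose in mmol/L, constant 22.5) rather than the exact variant used by the authors.

```python
def homa_ir(fasting_insulin_uU_ml: float, fasting_glucose_mmol_l: float) -> float:
    """Standard HOMA-IR index: insulin [uU/mL] * glucose [mmol/L] / 22.5.
    Higher values indicate lower insulin sensitivity."""
    return fasting_insulin_uU_ml * fasting_glucose_mmol_l / 22.5

print(homa_ir(10.0, 5.5))   # ~2.44 for a hypothetical subject
```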
Abstract:
One of the great challenges for the scientific community working on theories of genetic information, genetic communication and genetic coding is to determine a mathematical structure related to DNA sequences. In this paper we propose a model of an intra-cellular transmission system of genetic information, similar to a model of a power- and bandwidth-efficient digital communication system, in order to identify a mathematical structure in DNA sequences where such sequences are biologically relevant. The model of a transmission system of genetic information is concerned with the identification, reproduction and mathematical classification of the nucleotide sequence of single-stranded DNA by the genetic encoder. Hence, a genetic encoder is devised in which labelings and cyclic codes are established. Establishing the algebraic structure of the corresponding code alphabets, mappings, labelings, primitive polynomials (p(x)) and code generator polynomials (g(x)) is quite important in characterizing error-correcting code subclasses of G-linear codes. These latter codes are useful for the identification, reproduction and mathematical classification of DNA sequences. The characterization of this model may contribute to the development of a methodology that can be applied in mutational analysis and polymorphisms, production of new drugs and genetic improvement, among other things, resulting in the reduction of time and laboratory costs.
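The paper's actual labelings and generator polynomials are not given in the abstract; as a toy illustration of the general idea, the sketch below maps each nucleotide to a bit pair and tests whether the 14-bit word obtained from a 7-nucleotide sequence is a multiple of g(x) = x^3 + x + 1 (a divisor of x^14 + 1 over GF(2), so the resulting length-14 code is cyclic). Both the labeling (A→00, C→01, G→10, T→11) and the choice of g(x) are assumptions made for the example only.

```python
# Hypothetical nucleotide labeling and generator polynomial, for illustration only.
LABEL = {"A": (0, 0), "C": (0, 1), "G": (1, 0), "T": (1, 1)}
G = [1, 0, 1, 1]                      # g(x) = x^3 + x + 1, highest degree first

def gf2_remainder(word, g):
    """Remainder of word(x) divided by g(x) over GF(2) (long division)."""
    r = list(word)
    for i in range(len(r) - len(g) + 1):
        if r[i]:
            for j, gj in enumerate(g):
                r[i + j] ^= gj
    return r[len(r) - len(g) + 1:]    # remainder has deg < deg(g)

def is_codeword(dna):
    """True if the labeled sequence is a multiple of g(x), i.e. a codeword
    of the cyclic code generated by g(x)."""
    bits = [b for nt in dna for b in LABEL[nt]]
    return not any(gf2_remainder(bits, G))

print(is_codeword("AGTTCAG"))         # membership test under the assumed labeling
```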
Abstract:
The growth of organs and whole plants depends on both cell growth and cell-cycle progression, but the interaction between the two processes is poorly understood. In plants, the balance between growth and cell-cycle progression requires coordinated regulation of four different processes: macromolecular synthesis (cytoplasmic growth), turgor-driven cell-wall extension, the mitotic cycle, and the endocycle. Potential feedbacks between these processes include a cell-size checkpoint operating before DNA synthesis and a link between DNA content and maximum cell size. In addition, key intercellular signals and growth regulatory genes appear to target cell-cycle and cell-growth functions at the same time. For example, auxin, gibberellin, and brassinosteroid all have parallel links to cell-cycle progression (through S-phase Cyclin D-CDK and the anaphase-promoting complex) and to cell-wall functions (through cell-wall extensibility or microtubule dynamics). Another intercellular signal mediated by microtubule dynamics is the mechanical stress caused by the growth of interconnected cells. Superimposed on developmental controls, sugar signalling through the TOR pathway has recently emerged as a central control point linking cytoplasmic growth, cell-cycle and cell-wall functions. Recent progress in quantitative imaging and computational modelling will facilitate analysis of the multiple interconnections between plant cell growth and the cell cycle and ultimately will be required for the predictive manipulation of plant growth.
Abstract:
Monte Carlo track structure (MCTS) simulations have been recognized as useful tools for radiobiological modeling. However, the authors noticed several issues regarding the consistency of reported data. Therefore, in this work, they analyze the impact of various user-defined parameters on simulated direct DNA damage yields. In addition, they draw attention to discrepancies in the published literature in DNA strand break (SB) yields and in the selected methodologies. The MCTS code Geant4-DNA was used to compare radial dose profiles in a nanometer-scale region of interest (ROI) for photon sources of varying sizes and energies. Then, electron tracks of 0.28–220 keV were superimposed on a geometric DNA model composed of 2.7 × 10^6 nucleosomes, and SBs were simulated according to four definitions based on energy deposits or energy transfers in DNA strand targets compared to a threshold energy ETH. The SB frequencies and complexities in nucleosomes as a function of incident electron energies were obtained. SBs were classified into higher-order clusters such as single and double strand breaks (SSBs and DSBs) based on inter-SB distances and on the number of affected strands. Comparisons of different nonuniform dose distributions lacking charged particle equilibrium may lead to erroneous conclusions regarding the effect of energy on relative biological effectiveness. The energy transfer-based SB definitions give SB yields similar to the energy deposit-based definition when ETH ≈ 10.79 eV, but deviate significantly for higher ETH values. Between 30 and 40 nucleosomes/Gy show at least one SB in the ROI. The number of nucleosomes that present a complex damage pattern of more than 2 SBs, and the degree of complexity of the damage in these nucleosomes, diminish as the incident electron energy increases. DNA damage classification into SSBs and DSBs is highly dependent on the definitions of these higher-order structures and on their implementations. The authors show that, for the four studied models, yields differ by up to 54% for SSBs and by up to 32% for DSBs, as a function of the incident electron energy and of the models being compared. MCTS simulations allow comparison of the direct DNA damage types and complexities induced by ionizing radiation. However, simulation results depend to a large degree on user-defined parameters, definitions, and algorithms such as the DNA model, dose distribution, SB definition, and the DNA damage clustering algorithm. These interdependencies should be well controlled during the simulations and explicitly reported when comparing results to experiments or calculations.
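As the abstract stresses, the clustering of SBs into SSBs and DSBs depends on the chosen definitions; the sketch below implements one common convention (a DSB when two breaks on opposite strands lie within 10 bp) purely as an illustrative stand-in for the definitions compared in the paper.

```python
def classify_breaks(breaks, max_sep=10):
    """Toy clustering of strand breaks, given as (position in bp, strand 0/1).
    A DSB is scored when two breaks on opposite strands lie within `max_sep`
    base pairs; every remaining break counts as an SSB."""
    breaks = sorted(breaks)
    used = [False] * len(breaks)
    dsb = 0
    for i, (pos_i, strand_i) in enumerate(breaks):
        if used[i]:
            continue
        for j in range(i + 1, len(breaks)):
            pos_j, strand_j = breaks[j]
            if pos_j - pos_i > max_sep:
                break
            if not used[j] and strand_j != strand_i:
                dsb += 1
                used[i] = used[j] = True
                break
    return used.count(False), dsb      # (SSBs, DSBs)

print(classify_breaks([(12, 0), (15, 1), (60, 0), (300, 1)]))   # -> (2, 1)
```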
Abstract:
A method using the ring-oven technique for pre-concentration on filter paper discs and near-infrared hyperspectral imaging is proposed to identify four detergent and dispersant additives and to determine their concentration in gasoline. Different approaches were used to select the best image data processing in order to gather the relevant spectral information. This was attained by selecting the pixels of the region of interest (ROI), using a pre-calculated threshold value of the PCA scores arranged as histograms, to select the spectra set; by summing up the selected spectra to achieve representativeness; and by compensating for the superimposed filter paper spectral information, also supported by score histograms for each individual sample. The best classification model was achieved using linear discriminant analysis with a genetic algorithm (LDA/GA), whose correct classification rate in the external validation set was 92%. Prior classification of the type of additive present in the gasoline is necessary to define the PLS model required for its quantitative determination. Since two of the additives studied show high spectral similarity, a single PLS regression model was constructed to predict their content in gasoline, while two additional models were used for the remaining additives. The external validation of these regression models showed a mean percentage error of prediction varying from 5 to 15%.
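The abstract does not detail the image processing pipeline; the sketch below is a generic approximation of the described steps (ROI selection by thresholding PCA scores, summing the selected spectra, then PLS regression), run on synthetic arrays. The threshold, array sizes and number of latent variables are assumptions, not the authors' settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic stand-in: 'cube' holds the pixel spectra of one hyperspectral image
# (pixels x wavelengths); real data would come from the NIR camera.
cube = rng.random((1000, 200))

# 1) ROI selection: threshold on the first PCA score, mimicking the
#    score-histogram criterion described in the abstract (threshold assumed).
scores = PCA(n_components=1).fit_transform(cube).ravel()
roi_spectra = cube[scores > np.percentile(scores, 75)]

# 2) Sum the selected spectra to obtain one representative spectrum per image.
summed_spectrum = roi_spectra.sum(axis=0)

# 3) PLS regression of summed spectra against additive concentration
#    (toy matrices; one row per gasoline sample).
X = rng.random((30, 200))
y = rng.random(30)
pls = PLSRegression(n_components=3).fit(X, y)
print(pls.predict(X[:2]).ravel())
```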
Abstract:
This article discusses the role of interaction in the process of teaching and learning Portuguese for deaf students at an inclusive school. In the context where the research took place, the hearing teacher does not understand sign language, and her classroom includes hearing students and four deaf students, three of whom are sign language users. As the communication between the hearing teacher and the deaf students occurred in different codes - Portuguese and Brazilian Sign Language - and adopting a social-interactional approach to language (MOITA LOPES, 1986; FREIRE, 1999), we observed whether the interaction among the subjects enabled the deaf students to understand what was being taught. The results showed that having four deaf students in the same classroom allowed them to work cooperatively. In addition, sign language became more visible in this institution. On the other hand, the interaction between the teacher and her deaf students proved to be of little significance for the learning process of this small group.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
OBJECTIVE: This study aimed to assess oral health conditions in a population aged 60 years and over living in Botucatu, Southeastern Brazil. A cross-sectional population-based study was carried out using a random sample (N=372) of the urban population aged 60 years and over from the city of Botucatu, in 2005. World Health Organization criteria and codes for oral health epidemiological surveys were used. Re-examination was carried out in 10% of individuals to evaluate intra-examiner agreement. Statistical analysis was performed by one-way ANOVA or Kruskal-Wallis ANOVA, as applicable; the t-test was used in the absence of homoscedasticity. Fisher's exact test was used in situations where categories with fewer than five units were observed. Adjusted residuals and multiple-comparison analysis were conducted to identify associations between variable categories and subgroups. Intra-examiner agreement was 98% and the Kappa statistic was 0.95. The missing component represented 90.68% of the DMFT index, which was 29.85. The prevalence of edentulism was 63.17%. Upper and lower dentures were found in 80% and 58% of individuals, respectively, with the complete denture being the most commonly used. Among those studied, 15% required upper dentures and 38% required lower dentures. The greatest need was for complete dentures in both jaws. Approximately 20% had soft tissue alterations. Regarding periodontal conditions, most sextants were excluded (81.81%). Periodontal pockets (4-5 mm) were seen in 11.29% of the examined individuals. The oral health status of the elderly population in Botucatu is poor, as in other Brazilian cities. The results of this study may help in planning collective health actions by giving an accurate description of the oral problems among the elderly.
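For reference, the DMFT index summarized above is the per-person sum of decayed (D), missing (M) and filled (F) teeth, averaged over the sample; the sketch below computes a mean DMFT and the share of the missing component for a few made-up subjects.

```python
# Toy DMFT calculation with invented records (D = decayed, M = missing, F = filled).
subjects = [
    {"D": 1, "M": 27, "F": 2},
    {"D": 0, "M": 30, "F": 0},
    {"D": 2, "M": 24, "F": 3},
]

dmft_per_subject = [s["D"] + s["M"] + s["F"] for s in subjects]
mean_dmft = sum(dmft_per_subject) / len(subjects)
missing_share = 100 * sum(s["M"] for s in subjects) / sum(dmft_per_subject)

print(f"mean DMFT = {mean_dmft:.2f}, missing component = {missing_share:.1f}%")
```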
Abstract:
OBJECTIVE: To compare the use of disease classification codes in requests for sick leave for dental reasons. METHODS: A total of 240 requests issued in a federal public service between January 2008 and December 2009 were analyzed. The use of the International Statistical Classification of Diseases and Related Health Problems, Tenth Revision (ICD-10) was compared with the International Classification of Diseases applied to Dentistry and Stomatology (ICD-DA). The specificity of the coding in the leave requests was determined, as well as that of the coding assigned by official experts in indirect inspections, expert examinations and dental boards. RESULTS: Of all certificates, 22.9% did not include an ICD code, 7.1% used ICD-9, 3.3% ICD-DA and 66.7% ICD-10. Most codings were concordant (55.1%), with greater specificity in the codings assigned after evaluation by the official expert dentists. CONCLUSIONS: The use of ICD-10 among dentistry professionals and in occupational dental expert assessment needs to be improved. The incorporation of the ICD-DA and the International Classification of Functioning, Disability and Health into the analysis of sick leave is suggested, providing relevant data for monitoring absenteeism for dental reasons.
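As a small illustration of the kind of concordance figure reported above, the sketch below computes the proportion of certificates whose ICD-10 code matches the code assigned by the official expert dentist; the records are hypothetical.

```python
# Toy agreement check between the ICD-10 code on the certificate and the
# code assigned by the expert dentist (hypothetical records).
records = [
    ("K04.7", "K04.7"),
    ("K08.8", "K01.1"),
    ("K04.0", "K04.0"),
    ("Z98.8", "K08.1"),
]

agreement = sum(cert == expert for cert, expert in records) / len(records)
print(f"concordant codings: {agreement:.1%}")   # 50.0% for this toy sample
```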