865 results for Cipher and telegraph codes


Relevance:

100.00%

Publisher:

Abstract:

Currently, business management is far from being recognised as a profession. This paper suggests that a professional spirit should be developed which could function as a filter for commercial reasoning. Broadly, management will not be organised within the framework of a well-established profession unless formal knowledge, licensing, professional autonomy and professional codes of conduct are sufficiently developed. In developing business management as a profession, law may play a key role. If business management is to become more professionalised, managers must show that they are willing to adopt ethical values when arriving at business decisions. The paper argues that ethics cannot survive without legal regulation, which, in turn, will not be sustained unless lawyers can find alternatives to the large mechanisms of official society, secured by the monopolised coercion of the nation state. From a micro perspective of law and business ethics, communities can be developed with their own conventions, rules and standards, generated and sanctioned within the boundaries of the communities themselves.

Relevance:

100.00%

Publisher:

Abstract:

In their discussion - Professionalism and Ethics in Hospitality - James R. Keiser, Associate Professor, and John Swinton, Instructor, Hotel, Restaurant and Institutional Management, The Pennsylvania State University, initially offer: “Referring to ‘the hospitality profession’ necessitates thinking of the ethics of that profession and how ethics can be taught. The authors discuss what it means for the hospitality industry to be a profession.” The authors note that, at least in an academic sense, the hospitality industry at large receives only a cursory nod to the terms profession and professional. Keiser and Swinton also point out that ethics and professionalism are distinct, though related, concepts. Their intangible nature makes them difficult to define, but ethics in contemporary hospitality has, to some degree, been charted and quantified. “We have left the caveat emptor era, and the common law, the Uniform Commercial Code, and a variety of local ordinances now dictate that the goods and services hospitality offers carry an implied warranty of merchantability,” the authors inform you. About the symbiotic relationship between ethics and professionalism, the authors say this: “The less precise a code of ethics, as a general rule, the fewer claims the group has to professional status.” The statement above may be considered a cornerstone principle. “However, the mere existence of an ethical code (or of professional status, for that matter) does not ensure ethical behavior in any group,” caution Keiser and Swinton. “Codes of ethics do not really define professionalism except as they adopt a group's special, arcane, exclusionary jargon. Worse, they can define the minimum, agreed-upon standards of conduct and thereby encourage ethical corner-cutting,” they further qualify the thought. And, in bridging academia, Keiser and Swinton say, “Equipped now with a sense of the ironies and ambiguities inherent in labeling any work ‘professional,’ we can turn to the problem of instilling in students a sense of what is professionally ethical. Students appear to welcome this kind of instruction, and while we would like to think their interest comes welling up from altruism and intellectual curiosity rather than drifting down as Watergate and malpractice fallout, our job is to teach, not to weigh the motives that bring us our students, and to provide a climate conducive to ethical behavior, not supply a separate answer for every contingency.” Keiser and Swinton illustrate their treatise on ethics via the hypothetical tale [stylized case study] of Cosmo Cuisiner, who manages the Phoenix, a large suburban restaurant. Cosmo is “…a typical restaurant manager faced with a series of stylized, over-simplified, but illustrative decisions, each with its own ethical skew for the students to analyze.” A shortened version of that case study is presented. Figure 1 outlines the State Restaurant Association Code of Ethics.

Relevance:

100.00%

Publisher:

Abstract:

Many U.S. students do not perform well on mathematics assessments with respect to algebra topics such as linear functions, a building block for other functions. The poor achievement of U.S. middle school students in this topic is a problem. U.S. eighth graders had average mathematics scores on international comparison tests such as the Third International Mathematics and Science Study, later known as the Trends in International Mathematics and Science Study (TIMSS), in 1995, 1999 and 2003, while Singapore students had the highest average scores. U.S. eighth grade average mathematics scores improved on TIMSS 2007 and held steady on TIMSS 2011. Results from PISA 2009 and 2012 and from the National Assessment of Educational Progress of 2007, 2009 and 2013 showed a lack of proficiency in algebra. Results of curriculum studies involving nations in TIMSS suggest that elementary and middle grades textbooks in high-scoring countries differed from U.S. textbooks with respect to general features. The purpose of this study was to compare treatments of linear functions in Singapore and U.S. middle grades mathematics textbooks. Results revealed the features currently present in these textbooks. The findings should be valuable to constituencies who wish to improve U.S. mathematics achievement. Portions of eight Singapore and nine U.S. middle school student texts pertaining to linear functions were compared with respect to 22 features in three categories: (a) background features, (b) general features of problems, and (c) specific characterizations of problem practices, problem-solving competency types, and transfer of representation. Features were coded using a codebook developed by the researcher. Tallies and percentages were reported. Welch's t-tests and chi-square tests were used, respectively, to determine whether texts differed significantly on the features and whether codes were independent of country. U.S. and Singapore textbooks differed in page appearance and in the number of pages, problems, and images. Texts were similar in problem appearance. Differences in problems related to the assessment of conceptual learning. U.S. texts contained more problems requiring (a) use of definitions, (b) a single computation, (c) interpreting, and (d) multiple responses. These differences may stem from cultural differences reflected in attitudes toward education. Future studies should focus on page density, the spiral approach, and multiple-response problems.
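
For illustration of the statistical comparison described above, a minimal sketch of a Welch's t-test and a chi-square test of independence on hypothetical feature tallies (the counts are invented, not the study's data) is:

```python
# Illustrative sketch (not the study's actual data): how coded textbook
# features might be compared with Welch's t-test and a chi-square test.
import numpy as np
from scipy import stats

# Hypothetical per-chapter problem counts for one feature in each country's texts.
us_counts = np.array([34, 41, 29, 38, 45, 33, 40, 37, 31])
sg_counts = np.array([22, 27, 19, 25, 30, 24, 21, 26])

# Welch's t-test: does the mean count differ, without assuming equal variances?
t_stat, t_p = stats.ttest_ind(us_counts, sg_counts, equal_var=False)

# Chi-square test of independence: are feature codes independent of country?
# Rows = country, columns = tallies of three hypothetical problem codes.
contingency = np.array([[120, 80, 45],
                        [ 95, 60, 70]])
chi2, chi_p, dof, expected = stats.chi2_contingency(contingency)

print(f"Welch t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {chi_p:.3f}")
```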

Relevance:

100.00%

Publisher:

Abstract:

The presence of high phase noise in addition to additive white Gaussian noise in coherent optical systems affects the performance of forward error correction (FEC) schemes. In this paper, we propose a simple scheme for such systems, using block interleavers and binary Bose–Chaudhuri–Hocquenghem (BCH) codes. The block interleavers are specifically optimized for differential quadrature phase shift keying modulation. We propose a method for selecting BCH codes that, together with the interleavers, achieve a target post-FEC bit error rate (BER). This combination of interleavers and BCH codes has very low implementation complexity. In addition, our approach is straightforward, requiring only short pre-FEC simulations to parameterize a model, based on which we select codes analytically. We aim to correct a pre-FEC BER of around (Formula presented.). We evaluate the accuracy of our approach using numerical simulations. For a target post-FEC BER of (Formula presented.), codes selected using our method result in BERs around 3× the target and achieve the target with around 0.2 dB of extra signal-to-noise ratio.
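
The interleaving operation at the heart of the scheme is a conventional row/column block interleaver; a minimal sketch (with illustrative dimensions, not the paper's DQPSK-optimized ones) is:

```python
# Minimal row/column block interleaver sketch: bits are written row by row
# into an n_rows x n_cols array and read out column by column, so a burst of
# consecutive channel errors is spread across different codewords.
# The dimensions here are illustrative, not the paper's optimized choice.

def interleave(bits, n_rows, n_cols):
    assert len(bits) == n_rows * n_cols
    return [bits[r * n_cols + c] for c in range(n_cols) for r in range(n_rows)]

def deinterleave(bits, n_rows, n_cols):
    assert len(bits) == n_rows * n_cols
    out = [0] * (n_rows * n_cols)
    i = 0
    for c in range(n_cols):
        for r in range(n_rows):
            out[r * n_cols + c] = bits[i]
            i += 1
    return out

block = list(range(12))                      # stand-in for 12 coded bits
tx = interleave(block, n_rows=3, n_cols=4)
assert deinterleave(tx, 3, 4) == block       # round trip restores the order
```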

Relevance:

100.00%

Publisher:

Abstract:

Over the past few years, the number of wireless network users has been increasing. Until now, radio frequency (RF) has been the dominant technology; however, the electromagnetic spectrum in that region is becoming saturated, creating demand for alternative wireless technologies. Recently, with the growing market for LED lighting, Visible Light Communication (VLC) has been drawing attention from the research community. First, the LED is an efficient device for illumination. Second, it is easy to modulate and offers high bandwidth. Finally, it can combine illumination and communication in the same device; in other words, it allows highly efficient wireless communication systems to be implemented. One of the most important aspects of a communication system is its reliability when working over noisy channels. In such scenarios, the received data can be affected by errors. For the system to work properly, a channel encoder is usually employed. Its function is to encode the data to be transmitted so as to increase system performance. It commonly uses error-correcting codes (ECC), which append redundant information to the original data. At the receiver side, the redundant information is used to recover the erroneous data. This dissertation presents the implementation steps of a channel encoder for VLC. Several techniques were considered, such as Reed-Solomon and convolutional codes, block and convolutional interleaving, CRC and puncturing. A detailed analysis of the characteristics of each technique was made in order to choose the most appropriate ones. Simulink models were created to simulate how different codes behave in different scenarios. Later, the models were implemented in an FPGA and simulations were performed. Hardware co-simulations were also implemented to speed up the simulations. In the end, different techniques were combined to create a complete channel encoder capable of detecting and correcting both random and burst errors, thanks to the use of an RS(255,213) code with a block interleaver. Furthermore, after the decoding process, the proposed system can identify uncorrectable errors in the decoded data by means of the CRC-32 algorithm.
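
As an illustration of the error-detection stage described above, the sketch below appends a CRC-32 to a payload and verifies it at the receiver, using Python's zlib rather than the dissertation's FPGA implementation.

```python
# Sketch of the CRC-32 detection step: a 32-bit checksum is appended to the
# payload before transmission and recomputed at the receiver; any mismatch
# flags the frame as containing uncorrectable errors. This mirrors the role
# of CRC-32 in the text, not the FPGA implementation itself.
import zlib

def add_crc(payload: bytes) -> bytes:
    crc = zlib.crc32(payload) & 0xFFFFFFFF
    return payload + crc.to_bytes(4, "big")

def check_crc(frame: bytes) -> bool:
    payload, received = frame[:-4], int.from_bytes(frame[-4:], "big")
    return (zlib.crc32(payload) & 0xFFFFFFFF) == received

frame = add_crc(b"VLC test frame")
assert check_crc(frame)                      # clean frame passes
corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]
assert not check_crc(corrupted)              # single bit flip is detected
```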

Relevance:

100.00%

Publisher:

Abstract:

We propose weakly-constrained stream and block codes with tunable pattern-dependent statistics and demonstrate that the block code capacity at large block sizes is close to the prediction obtained from a simple Markov model published earlier. We demonstrate the feasibility of the codes by presenting original encoding and decoding algorithms with complexity log-linear in the block size and with modest table memory requirements. We also show that when such codes are used to mitigate patterning effects in optical fibre communications, a gain of about 0.5 dB is possible under realistic conditions, at the expense of a small redundancy (≈10%).
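
A simple Markov-model estimate of the kind referred to above can be obtained from the entropy rate of a transition matrix; the sketch below computes it for an illustrative two-state source (the matrix is an assumption, not the model used in the paper).

```python
# Entropy rate of a two-state Markov source, as a simple capacity-style
# estimate for a code with pattern-dependent statistics. The transition
# matrix below is illustrative only, not the one used in the paper.
import numpy as np

P = np.array([[0.9, 0.1],      # P[i, j] = probability of emitting j after i
              [0.4, 0.6]])

# Stationary distribution: left eigenvector of P with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

# Entropy rate in bits per emitted symbol.
H = -sum(pi[i] * P[i, j] * np.log2(P[i, j])
         for i in range(2) for j in range(2) if P[i, j] > 0)
print(f"entropy rate ≈ {H:.3f} bits/symbol")
```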

Relevance:

100.00%

Publisher:

Abstract:

One of the great challenges for the scientific community in theories of genetic information, genetic communication and genetic coding is to determine a mathematical structure related to DNA sequences. In this paper we propose a model of an intra-cellular transmission system of genetic information, similar to a model of a power- and bandwidth-efficient digital communication system, in order to identify a mathematical structure in DNA sequences where such sequences are biologically relevant. The model of a transmission system of genetic information is concerned with the identification, reproduction and mathematical classification of the nucleotide sequence of single-stranded DNA by the genetic encoder. Hence, a genetic encoder is devised in which labelings and cyclic codes are established. Establishing the algebraic structure of the corresponding code alphabets, mappings, labelings, primitive polynomials p(x) and code generator polynomials g(x) is quite important in characterizing subclasses of error-correcting G-linear codes. These latter codes are useful for the identification, reproduction and mathematical classification of DNA sequences. The characterization of this model may contribute to the development of a methodology that can be applied in mutational analysis and polymorphisms, the production of new drugs and genetic improvement, among other things, resulting in reduced time and laboratory costs.
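
As a toy illustration of the labeling and generator-polynomial idea, the sketch below maps nucleotides to 2-bit labels and tests whether the resulting binary word belongs to the cyclic code generated by a given g(x) over GF(2); the labeling and generator are assumptions for illustration, not the structures identified in the paper.

```python
# Toy sketch of the encoder idea: map nucleotides to binary labels and test
# whether the resulting word is a codeword of a binary cyclic code, i.e.
# whether its polynomial is divisible by the generator polynomial g(x).
# The labeling (A->00, C->01, G->10, T->11) and g(x) = x^3 + x + 1 are
# illustrative assumptions, not the structures identified in the paper.

LABEL = {"A": "00", "C": "01", "G": "10", "T": "11"}

def seq_to_bits(seq):
    return [int(b) for nt in seq for b in LABEL[nt]]

def poly_mod2_remainder(bits, g):
    """Remainder of the message polynomial divided by g(x) over GF(2)."""
    r = bits[:]                       # most significant coefficient first
    for i in range(len(r) - len(g) + 1):
        if r[i]:
            for j, gj in enumerate(g):
                r[i + j] ^= gj
    return r[-(len(g) - 1):]

g = [1, 0, 1, 1]                      # g(x) = x^3 + x + 1
word = seq_to_bits("ACGT")            # 8-bit word from a 4-nucleotide window
remainder = poly_mod2_remainder(word, g)
print("codeword of <g(x)>" if not any(remainder) else "not a codeword")
```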

Relevance:

100.00%

Publisher:

Abstract:

This paper presents an investigation of design code provisions for steel-concrete composite columns. The study covers the national building codes of the United States, Canada and Brazil, and the transnational EUROCODE. The study is based on experimental results for 93 axially loaded concrete-filled tubular steel columns. This includes 36 unpublished, full-scale experimental results by the authors and 57 results from the literature. The error of the resistance models is determined by comparing experimental ultimate loads with code-predicted column resistances. Regression analysis is used to describe the variation of model error with column slenderness and to describe model uncertainty. The paper shows that the Canadian and European codes are able to predict mean column resistance, since the resistance models of these codes present detailed formulations for concrete confinement by the steel tube. The ANSI/AISC and Brazilian codes make limited allowance for concrete confinement and become very conservative for short columns. Reliability analysis is used to evaluate the safety level of the code provisions. The reliability analysis includes model error and other random problem parameters such as steel and concrete strengths and dead and live loads. Design code provisions are evaluated in terms of sufficient and uniform reliability criteria. Results show that the four design codes studied provide uniform reliability, with the Canadian code being best in achieving this goal, a result of a well-balanced code, both in terms of load combinations and resistance model. The European code is less successful in providing uniform reliability, a consequence of the partial factors used in load combinations. The paper also shows that reliability indexes of columns designed according to the European code can be as low as 2.2, well below the target reliability levels of EUROCODE.
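
As a sketch of how such a reliability index can be estimated, the following Monte Carlo example samples model error, material strengths and loads for a generic resistance-minus-load margin; all distributions and parameters are illustrative assumptions, not those used in the paper.

```python
# Monte Carlo sketch of a reliability analysis: sample resistance model error,
# material strength scatter and loads, estimate the failure probability of the
# margin g = R - (D + L), and convert it to a reliability index beta.
# All distribution parameters are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 1_000_000

model_error = rng.lognormal(mean=0.0, sigma=0.10, size=n)      # resistance model error
resistance = model_error * rng.normal(2000.0, 200.0, size=n)   # kN, column resistance
dead_load = rng.normal(800.0, 80.0, size=n)                    # kN
live_load = rng.gumbel(500.0, 75.0, size=n)                    # kN, extreme live load

g = resistance - (dead_load + live_load)
p_f = np.mean(g < 0.0)
beta = -norm.ppf(p_f)
print(f"P_f ≈ {p_f:.2e}, beta ≈ {beta:.2f}")
```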

Relevance:

100.00%

Publisher:

Abstract:

Lightning-induced overvoltages have a considerable impact on the power quality of overhead distribution and telecommunications systems, and various models have been developed for the computation of the electromagnetic transients caused by indirect strokes. The most adequate has been shown to be the one proposed by Agrawal et al.; the Rusck model can be viewed as a particular case, as both models are equivalent when the lightning channel is perpendicular to the ground plane. In this paper, an extension of the Rusck model that enables the calculation of lightning-induced transients considering flashes to nearby elevated structures and realistic line configurations is tested against data obtained from both natural lightning and scale model experiments. The latter, performed under controlled conditions, can also be used to verify the validity of other coupling models and relevant codes. The so-called Extended Rusck Model, which is shown to be sufficiently accurate, is applied to the analysis of lightning-induced voltages on lines with a shield wire and/or surge arresters. The investigation conducted indicates that the ratio between the peak values of the voltages induced by typical first and subsequent strokes can be either greater or smaller than unity, depending on the line configuration.
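
For context, the peak voltage predicted by the classical (non-extended) Rusck formula for an infinitely long single-conductor line over perfectly conducting ground can be evaluated as in the sketch below; the stroke and line parameters are illustrative values.

```python
# Classical Rusck estimate of the peak lightning-induced voltage on an
# infinitely long single-conductor line above perfectly conducting ground.
# Stroke current, return-stroke speed, line height and distance below are
# illustrative values only.
import math

MU0 = 4.0e-7 * math.pi                         # H/m
EPS0 = 8.854e-12                               # F/m
Z0 = 0.25 / math.pi * math.sqrt(MU0 / EPS0)    # ≈ 30 ohms

def rusck_peak_voltage(i_peak, height, distance, v_ratio=0.4):
    """Peak induced voltage (V) for stroke current i_peak (A), line height (m),
    stroke-to-line distance (m) and return-stroke speed v = v_ratio * c."""
    factor = 1.0 + (v_ratio / math.sqrt(2.0)) / math.sqrt(1.0 - 0.5 * v_ratio**2)
    return Z0 * i_peak * height / distance * factor

# Example: 30 kA first stroke, 10 m high line, stroke 100 m away.
print(f"{rusck_peak_voltage(30e3, 10.0, 100.0) / 1e3:.1f} kV")
```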

Relevance:

100.00%

Publisher:

Abstract:

Amongst the infectious diseases that threaten equine health, herpesviral infections remain a worldwide cause of serious morbidity and mortality. Equine herpesvirus-1 is the most important pathogen, causing an array of disorders including epidemic respiratory disease, abortion, neonatal foal death, myeloencephalopathy and chorioretinopathy. Despite intense scientific investigation, extensive use of vaccination, and established codes of practice for the control of disease outbreaks, infection and disease remain common. While equine herpesvirus-1 infection remains a daunting challenge for immunoprophylaxis, many critical advances in equine immunology have resulted from studies of this virus, particularly related to MHC-restricted cytotoxicity in the horse. A workshop was convened in San Gimignano, Tuscany, Italy in June 2004, to bring together clinical and basic researchers in the field of equine herpesvirus-1 study to discuss the latest advances and future prospects for improving our understanding of these diseases and of equine immunity to herpesviral infection. This report highlights the new information that was the focus of this workshop, and is intended to summarize this material and identify the critical questions in the field.

Relevance:

100.00%

Publisher:

Abstract:

A procedure for coupling mesoscale and CFD codes is presented, enabling the inclusion of realistic stratification flow regimes and boundary conditions in CFD simulations of relevance to site and resource assessment studies in complex terrain. Two distinct techniques are derived: (i) in the first one, boundary conditions are extracted from mesoscale results to produce time-varying CFD solutions; (ii) in the second case, a statistical treatment of mesoscale data leads to steady-state flow boundary conditions believed to be more representative than the idealised profiles which are current industry practice. Results are compared with measured data and traditional CFD approaches.
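
A minimal sketch of the second technique, reducing a mesoscale wind time series to frequency-weighted steady-state inflow conditions by direction-sector averaging, is given below; the synthetic series, sector width and shear exponent are illustrative assumptions, not the paper's procedure.

```python
# Sketch of the second coupling technique: reduce a mesoscale wind time series
# to steady-state inflow conditions by binning it into direction sectors and
# averaging the speed profile within each sector. The synthetic data and the
# 30-degree sectors are assumptions for illustration, not the paper's setup.
import numpy as np

rng = np.random.default_rng(1)
heights = np.array([10.0, 40.0, 80.0, 120.0])              # m, mesoscale output levels
n_hours = 8760

direction = rng.uniform(0.0, 360.0, size=n_hours)          # deg, hourly wind direction
speed_10m = rng.weibull(2.0, size=n_hours) * 7.0            # m/s at 10 m
profiles = speed_10m[:, None] * (heights / 10.0) ** 0.14    # simple shear profile

sector = (direction // 30).astype(int)                      # twelve 30-degree sectors
for s in range(12):
    mask = sector == s
    freq = mask.mean()
    mean_profile = profiles[mask].mean(axis=0)
    # Each line below is one candidate steady-state CFD inflow condition,
    # weighted by its frequency of occurrence.
    print(f"sector {s*30:3d}-{s*30+30:3d} deg  freq={freq:.2f}  "
          f"U(z)={np.round(mean_profile, 1)}")
```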

Relevance:

100.00%

Publisher:

Abstract:

ABSTRACT - This research project aims to answer the question of how to manage an ethics unit within a central public health service "in a rational and informed way", defining its strategic goals (Baranger, quoting Drucker, 1990), using as a case study the Office of Legal Affairs, Ethics and Responsibility (hereinafter referred to as the Office) of the Directorate-General of Health. To this end, the first part of the study gives a descriptive theoretical account of the philosophical foundations of ethics, emphasising their practical application in determining the characteristics of health systems. The use of the concept of ethics in Public Health, in the context of bioethics, is then analysed, and distinctive elements are found that seem to justify an autonomous concept of 'Public Health Ethics'. The main sources consulted were the fundamental ethical principles in health, such as the Universal Declaration of Human Rights and the Declaration of Helsinki, as well as the Constitution of the Portuguese Republic and the codes of ethics of the health professions.
At this stage the study also surveys, from both a national and an international perspective, the existence of ethics units with a similar scope, and their areas and levels of intervention; for the international dimension, the appropriate bodies of the Member States of the European Union were consulted. In the second part of the research project, strategic planning for the Office was developed using the balanced scorecard methodology, and a proposal of strategic objectives and initiatives to be carried out over a three-year horizon is presented. The use of this methodology resulted in twelve strategic objectives, among which the following stand out: 'to foster ethical discussion'; 'to promote equality among users of the NHS'; and 'to identify priorities for action'. Among the initiatives to be developed is the design of a questionnaire, to be answered by the ethics committees of the Portuguese health system, in order to identify priorities for the Office's activities. The work ends with conclusions, recommendations and lines of research that should be pursued in the near future to deepen the subject of this study.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this work is the structural design of a reinforced concrete building, covering the different phases from the initial conception, with the definition of the structural model and a careful choice of the constituent elements and solutions, to the final design stage, considering the wind and seismic actions in addition to gravity loads. Within the scope of this work, the design of structural elements was considered, namely footings, walls, columns, beams and slabs, with safety verification in simple bending, combined bending and axial force, shear and punching, according to the needs of each element. To this end, an automatic spreadsheet (macro) was developed that allows the resistance of sections to be verified in simple bending and in shear, for elements with or without shear reinforcement. The internal forces underlying the structural verifications were calculated with a three-dimensional finite element program, namely the ROBOT STRUCTURAL ANALYSIS software. The general design criteria adopted, based on the regulations in force in Portugal (RSA, REBAP and the Eurocodes), as well as the calculation hypotheses considered in the ultimate limit state verification of the structural elements, are set out in detail throughout the work. The drawings of the structural elements designed, as well as the design drawings of the building, are presented in the Annex.
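
A minimal sketch of the kind of check such a spreadsheet macro performs, the bending resistance of a singly reinforced rectangular section with the Eurocode 2 rectangular stress block, is shown below; the section, materials and reinforcement are illustrative assumptions.

```python
# Sketch of the type of check such a macro performs: bending resistance of a
# rectangular reinforced concrete section with the EC2 rectangular stress
# block (lambda = 0.8, valid for fck <= 50 MPa), ignoring compression steel
# and ductility limits on the neutral axis depth.
# Section dimensions, materials and reinforcement below are illustrative.

def bending_resistance_kNm(b, d, As, fck, fyk):
    """b, d in m; As in m2; fck, fyk in MPa. Returns M_Rd in kNm."""
    fcd = fck / 1.5 * 1e3           # design concrete strength, kPa
    fyd = fyk / 1.15 * 1e3          # design steel yield strength, kPa
    x = As * fyd / (0.8 * b * fcd)  # neutral axis depth from force equilibrium, m
    return As * fyd * (d - 0.4 * x)

# 0.30 x 0.50 m beam, d = 0.45 m, 4 bars of 16 mm (As ≈ 8.04 cm2), C25/30, A500.
M_Rd = bending_resistance_kNm(b=0.30, d=0.45, As=8.04e-4, fck=25, fyk=500)
print(f"M_Rd ≈ {M_Rd:.0f} kNm")
```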

Relevance:

100.00%

Publisher:

Abstract:

ABSTRACT - Patient Safety has taken on growing relevance in healthcare organisations, as a result of the publication of several studies revealing the magnitude of the problem and, at the same time, of increasing pressure from public opinion and the media. This study aims to develop and evaluate the performance of an electronic adverse event detection system based on a Data Warehouse, comparing its results with those obtained by the traditional methodology of clinical record review. The main objective of the work was to identify a set of triggers/alert indicators that allow the most common potential adverse events to be detected. The system developed showed a Positive Predictive Value of 18.2%, a sensitivity of 65.1% and a specificity of 68.6%, and comprises nine indicators based on clinical information together with 445 ICD-9-CM codes for diagnoses and procedures. Despite some limitations, electronic adverse event detection systems offer numerous possibilities, namely real-time use and use as a complement to existing methodologies. Considering the importance of the problem under analysis and the need to deepen the results obtained in this project, it would be relevant to extend it to a wider set of hospitals; its replicability is facilitated, since the Data Warehouse is based on a set of applications disseminated at national level. The development and consolidation of electronic adverse event detection systems is undeniably an area for the future, improving the information available in healthcare organisations and contributing decisively to better healthcare for patients.
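
The performance figures quoted above follow the standard screening definitions, sketched below with hypothetical counts (not the study's confusion matrix):

```python
# Standard definitions behind the performance figures reported above: PPV,
# sensitivity and specificity computed from a confusion matrix of trigger-
# flagged admissions vs. adverse events confirmed by record review.
# The counts below are hypothetical, not the study's data.

def screening_metrics(tp, fp, fn, tn):
    ppv = tp / (tp + fp)            # flagged cases that are true adverse events
    sensitivity = tp / (tp + fn)    # adverse events caught by the triggers
    specificity = tn / (tn + fp)    # event-free admissions left unflagged
    return ppv, sensitivity, specificity

ppv, sens, spec = screening_metrics(tp=50, fp=150, fn=30, tn=500)
print(f"PPV={ppv:.1%}  sensitivity={sens:.1%}  specificity={spec:.1%}")
```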

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation in Communication Sciences (area of specialisation in Information and Journalism)