907 results for "Stabilisation of filter"
Abstract:
Mr. Korosenyi begins by analysing the particular relationship between politics and administration in different countries. Within Europe, three major patterns emerged in the 20th century: first, the politically neutral British Civil Service; second, the German and French state bureaucracies, which traditionally are supposed to embody the "common good"; and third, the patronage system of the so-called consociational democracies, e.g. Austria. In general, Mr. Korosenyi believes that, though politics does not penetrate the Hungarian administration to the extent it does in Belgium and Austria, there is nevertheless a stronger fusion than in the traditional British pattern. He is particularly interested in the effect of this relationship on democratic institution building and the stabilisation of the new regime in Hungary, now that the old "nomenklatura" system has been abolished. The structure of the Hungarian government was a result of the constitutional amendments of 1989 and 1990. Analysing this period, it becomes clear that for all the political actors who initiated and supported the transition to democracy, the underlying assumption was a radical depoliticisation of the administration in order to maintain its stability. The political leadership of the executive is a cabinet government. The government is structured along ministries, each headed by a politician, i.e. the minister, who is a member of the cabinet. The minister's political secretary is not a cabinet member, but he or she is a politician, usually a member of parliament. The head of the administration of the ministry is the administrative state secretary, who is a civil servant and usually has four deputies, also civil servants. Naturally, it is assumed that there should be a clear separation between politicians and civil servants. In practice, however, the borders can be blurred, giving rise to a hybrid known as the "political civil servant". Mr. Korosenyi analyses the different faces of these hybrids. They are civil servants for the following reasons: they need special educational qualifications, working experience, a civil service exam, etc.; they are not allowed to do anything incompatible with their impartial role; and they may neither occupy political office nor appear in the name of any political party. On the other hand, the accepted political dimension of their function is revealed by the following facts: the state secretary (a civil servant) may participate in cabinet meetings instead of the minister; the state secretary is employed by the minister; and a state secretary or any of the deputies can be dismissed at any time by the minister or the prime minister. In practice, then, ministers appoint to these senior administrative positions civil servants whose personal and political loyalties are strong. To the second level of political patronage in ministries belong the ministerial cabinet, the press office and the public relations office. The ministerial cabinet comprises the private advisors and members of the personal staff of the minister. The press office and the PR office, where they exist, are not fitted into the administrative hierarchy of the ministry but are under the direct control of the minister. At the beginning of the 1990s such offices were exceptions; in the second half of the 90s they are accepted and to be found in most ministries.
Mr. Korosenyi's work, a 92-page manuscript of a book in Hungarian, is the first piece of political science literature to analyse the structure of the Hungarian government in the 1990s and the relationship between the political leadership and the public administration.
Abstract:
With proper application of Best Management Practices (BMPs), the impact of sediment on water bodies can be minimized. However, finding the optimal allocation of BMPs can be difficult, since there are numerous possible options. Economics also plays an important role in BMP affordability and, therefore, in the number of BMPs that can be placed in a given budget year. In this study, two methodologies are presented for determining the optimal cost-effective BMP allocation by coupling a watershed-level model, the Soil and Water Assessment Tool (SWAT), with two different methods: targeting and a multi-objective genetic algorithm (Non-dominated Sorting Genetic Algorithm II, NSGA-II). For demonstration, the two methodologies were applied to an agriculture-dominated watershed in Lower Michigan to find the optimal allocation of filter strips and grassed waterways. For targeting, three different criteria were investigated for sediment yield minimization; in the process it was found that grassed waterways near the watershed outlet reduced the watershed outlet sediment yield the most under the study conditions, and cost minimization was included as a second objective during the cost-effective BMP allocation selection. NSGA-II was used to find the optimal BMP allocation for both sediment yield reduction and cost minimization. Comparing the results and computational time of both methodologies, targeting was determined to be the better method for finding the optimal cost-effective BMP allocation under the study conditions: it provided more than 13 times as many solutions with better fitness for the objective functions while using less than one eighth of the SWAT computational time required by NSGA-II with 150 generations.
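Since the abstract only names the algorithm, a minimal sketch may help fix ideas. The Pareto-dominance test below is the comparison at the core of NSGA-II's non-dominated sorting, applied to the study's two objectives; the allocations, function names and objective values are hypothetical placeholders, not SWAT output.

```python
# Minimal sketch of the Pareto-dominance test underlying NSGA-II, with BMP
# allocations scored on two objectives: sediment yield and cost (both to be
# minimized). The numbers below are illustrative, not SWAT model output.

def dominates(a, b):
    """True if allocation a is at least as good as b on every objective
    (lower sediment yield, lower cost) and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of (sediment_yield, cost) tuples."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# Hypothetical allocations as (sediment yield in t/yr, cost in $/yr):
candidates = [(120.0, 5000.0), (90.0, 8000.0), (95.0, 7000.0),
              (150.0, 3000.0), (130.0, 6000.0)]
print(pareto_front(candidates))  # (130.0, 6000.0) is dominated by (120.0, 5000.0)
```

NSGA-II repeats this sorting over generations of candidate allocations, so the final population approximates the full sediment-versus-cost trade-off curve rather than a single "best" solution.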
Abstract:
Pentatricopeptide repeat domain protein 1 (PTCD1) is a novel human protein that was recently shown to decrease the levels of mitochondrial leucine tRNAs. The physiological role of this regulation, however, remains unclear. Here we show that amino acid starvation by leucine deprivation significantly increased the steady-state mRNA levels of PTCD1 in human hepatocarcinoma (HepG2) cells. Amino acid starvation also increased the mitochondrially encoded leucine tRNA (tRNA(Leu(CUN))) and the mRNA for the mitochondrial leucyl-tRNA synthetase (LARS2). Despite the increased PTCD1 mRNA steady-state levels, amino acid starvation decreased PTCD1 at the protein level. Decreasing the PTCD1 protein concentration increases the stability of the mitochondrial leucine tRNAs tRNA(Leu(CUN)) and tRNA(Leu(UUR)), as shown by RNAi experiments against PTCD1. It is therefore likely that decreased PTCD1 protein contributes to the increased tRNA(Leu(CUN)) levels in amino acid-starved cells. The stabilisation of the mitochondrial leucine tRNAs and the upregulation of the mitochondrial leucyl-tRNA synthetase LARS2 might play a role in the adaptation of mitochondria to amino acid starvation.
Abstract:
Frequency selective surfaces (FSS) and reflect/transmitarrays are mostly designed on the basis of optimization using an electromagnetic simulator. That is a time-consuming method, and some decisions have to be made by intuition alone. Using equivalent circuits of the scatterers selected for the design allows intuition, and most of the optimization process, to be replaced by the application of the classic rules of filter design. This communication presents all the steps necessary to obtain the equivalent circuit of different square scatterers in a periodic lattice and to implement the desired FSS frequency behaviour by calculating the number of layers and the dimensions of the periodic cells. Several examples are included to evaluate the results.
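The communication's circuit details are not reproduced here, but the general workflow can be sketched under assumptions: each layer of square scatterers is approximated as a shunt series-LC branch (a common equivalent circuit for loop-type elements), the dielectric spacers as transmission-line sections, and the layers cascaded via ABCD matrices to predict the overall frequency response. All element values and the two-layer topology below are illustrative, not taken from the paper.

```python
import numpy as np

Z0 = 377.0   # free-space wave impedance (ohms)
c = 3e8      # speed of light (m/s)

def shunt_lc(f, L, C):
    """ABCD matrix of a shunt series-LC branch: an assumed equivalent
    circuit for one FSS layer of square scatterers."""
    w = 2 * np.pi * f
    Z = 1j * w * L + 1 / (1j * w * C)          # series L-C impedance
    return np.array([[1, 0], [1 / Z, 1]], dtype=complex)

def spacer(f, d, eps_r=1.0):
    """ABCD matrix of a dielectric spacer of thickness d between layers."""
    beta = 2 * np.pi * f * np.sqrt(eps_r) / c
    Zd = Z0 / np.sqrt(eps_r)
    bd = beta * d
    return np.array([[np.cos(bd), 1j * Zd * np.sin(bd)],
                     [1j * np.sin(bd) / Zd, np.cos(bd)]], dtype=complex)

def s21(f, L, C, d, layers=2):
    """Transmission through `layers` identical sheets separated by spacers."""
    M = np.eye(2, dtype=complex)
    for k in range(layers):
        M = M @ shunt_lc(f, L, C)
        if k < layers - 1:
            M = M @ spacer(f, d)
    A, B, Cm, D = M[0, 0], M[0, 1], M[1, 0], M[1, 1]
    return 2 / (A + B / Z0 + Cm * Z0 + D)

# Illustrative values: resonance near 16 GHz gives a band-stop response.
freqs = np.linspace(1e9, 20e9, 500)
resp = [20 * np.log10(abs(s21(f, L=2e-9, C=0.05e-12, d=5e-3))) for f in freqs]
```

Once each scatterer is reduced to L and C values, adding layers and choosing spacer lengths follows the classic coupled-resonator filter rules the abstract alludes to.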
Abstract:
This project focuses on the implementation of an active noise control system using genetic algorithms. It takes into account the type of noise to be cancelled and the design of the controller, a key part of the control system. Active noise control is only effective at low frequencies, up to about 250 Hz, precisely where passive elements lose effectiveness, and in small areas, enclosures and ducts. The controller must be able to track all the variations of the acoustic field that may occur (in phase, frequency, amplitude, electro-acoustic transfer functions, etc.). Its operation is based on adaptive FIR and IIR algorithms; the choice of one type of filter or the other depends on characteristics such as linearity, causality and the number of coefficients. For the controller's transfer function to follow the variations arising in the acoustic environment being cancelled, the filter coefficients must be updated by an adaptive algorithm. In this project a genetic algorithm is used as the adaptive algorithm; it is based on biological selection, i.e. it simulates the evolutionary behaviour of biological systems. The simulations were carried out with two types of signal: random noise (broadband) and periodic noise (narrowband). The final part of the project presents the results obtained and the corresponding conclusions.
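As a rough illustration of the approach described above, the sketch below evolves the coefficients of an 8-tap FIR controller with a simple genetic algorithm (truncation selection, arithmetic crossover, Gaussian mutation) to cancel a narrowband disturbance. The signals, GA parameters and the assumption of an ideal secondary path are all hypothetical simplifications, not the project's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reference noise (narrowband example: 100 Hz tone sampled at 2 kHz) and the
# disturbance to cancel. An ideal secondary path is assumed for brevity.
fs, n = 2000, 512
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 100 * t)                # reference signal
d = 0.8 * np.sin(2 * np.pi * 100 * t + 0.7)    # disturbance at the error mic

TAPS, POP, GENS = 8, 40, 200

def residual_energy(w):
    y = np.convolve(x, w)[:n]       # anti-noise produced by the FIR filter
    return np.sum((d - y) ** 2)     # energy left after cancellation

pop = rng.normal(0, 0.5, size=(POP, TAPS))
for _ in range(GENS):
    fit = np.array([residual_energy(w) for w in pop])
    elite = pop[np.argsort(fit)[:POP // 2]]    # keep the better half
    # Offspring: arithmetic crossover of random elite pairs + Gaussian mutation.
    i = rng.integers(0, len(elite), size=(POP - len(elite), 2))
    kids = 0.5 * (elite[i[:, 0]] + elite[i[:, 1]])
    kids += rng.normal(0, 0.05, size=kids.shape)
    pop = np.vstack([elite, kids])

best = min(pop, key=residual_energy)
print("residual energy:", residual_energy(best))
```

Unlike gradient-based LMS adaptation, the GA needs only the scalar error energy, which is why it is attractive when the electro-acoustic transfer functions are unknown or time-varying.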
Abstract:
Film audio restoration is a fairly complex process, and little research has been done in this field. Before restoring any archive, the files must be preserved and conserved as well as possible. Preservation comprises the measures taken to guarantee permanent access, while conservation ensures the existence of the archive in its most original form. Restoration, in turn, is based on the study of the deterioration that film supports suffer over time and of the processes available to correct it. Restoration must always retain as much originality as possible, i.e. it must keep the audio as it was originally first presented. In the first stage, the possible types of deterioration occurring in the archives are identified; this is easier if we know when and how the film was recorded, i.e. with which machine the recording was made and on which film support it is stored. Both the machines and the supports have evolved throughout history. The study of film supports lets us understand the degradation archives suffer over time and, consequently, the possible restorations. To avoid further degradation, the archives are preserved and conserved under conditions optimal for the support. Depending on the support, there are typical conditions of temperature, humidity, ventilation and so on under which the material keeps best. After these steps, restoration can proceed. The most typical restoration uses photochemical materials, but it is fairly complex; this project therefore analyses restoration after the film archives have been digitised. To digitise the archives correctly, the established digitisation standards and rules must be observed. Digitisation allows the typical alterations appearing in film materials to be identified; with the spectrogram as a tool we can determine the possible restoration solutions for each alteration. The alterations that can be found and identified are: buzz and interference; hiss and whistle; crackle; pops and clicks; wow; dropouts; intermittent noises; and reverberation. In the last part of the project, once all the typical alterations of film archives have been identified, each of them is studied with spectrogram tools and analysed in a more technical way. With the spectrogram, the tools that fix each alteration are determined, such as Reverb for reverberation or Decrackle for crackle, and in the technical framework the characteristics of each tool are determined, i.e. the type of filter, window, etc. that can be used to restore the audio for each alteration. Digital restoration is a field still to be investigated, but awareness should grow that it is a feasible solution: this type of restoration can keep the original sound and will not modify the archives, as is often assumed. The passage of time will gradually degrade and destroy the film supports on which the archives are stored, and the main objective is for film materials to endure throughout history.
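As a small illustration of the spectrogram-driven workflow described above, the sketch below computes a windowed spectrogram of a synthetic "soundtrack" and removes a 50 Hz buzz with a narrow notch filter; the signal, frequencies and filter settings are assumptions for demonstration, not the project's actual materials.

```python
import numpy as np
from scipy import signal

fs = 44100
t = np.arange(fs * 2) / fs
# Stand-in for a digitised film soundtrack: programme material plus 50 Hz hum.
audio = 0.3 * np.sin(2 * np.pi * 440 * t) + 0.2 * np.sin(2 * np.pi * 50 * t)

# Spectrogram with a Hann window: steady horizontal lines reveal buzz/hum,
# while vertical streaks would indicate pops and clicks.
f, ts, Sxx = signal.spectrogram(audio, fs, window="hann", nperseg=4096)

# A narrow IIR notch at the hum frequency; a high Q keeps the cut surgical,
# which helps the restored audio stay close to the original programme.
b, a = signal.iirnotch(w0=50, Q=30, fs=fs)
restored = signal.filtfilt(b, a, audio)
```

The choice of window length trades time resolution (catching clicks) against frequency resolution (resolving hum harmonics), which is exactly the kind of per-alteration tuning the project's technical framework addresses.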
Abstract:
Insight into the dependence of benthic communities on biological and physical processes in nearshore pelagic environments, long considered a “black box,” has eluded ecologists. In rocky intertidal communities at Oregon coastal sites 80 km apart, differences in abundance of sessile invertebrates, herbivores, carnivores, and macrophytes in the low zone were not readily explained by local scale differences in hydrodynamic or physical conditions (wave forces, surge flow, or air temperature during low tide). Field experiments employing predator and herbivore manipulations and prey transplants suggested top-down (predation, grazing) processes varied positively with bottom-up processes (growth of filter-feeders, prey recruitment), but the basis for these differences was unknown. Shore-based sampling revealed that between-site differences were associated with nearshore oceanographic conditions, including phytoplankton concentration and productivity, particulates, and water temperature during upwelling. Further, samples taken at 19 sites along 380 km of coastline suggested that the differences documented between two sites reflect broader scale gradients of phytoplankton concentration. Among several alternative explanations, a coastal hydrodynamics hypothesis, reflecting mesoscale (tens to hundreds of kilometers) variation in the interaction between offshore currents and winds and continental shelf bathymetry, was inferred to be the primary underlying cause. Satellite imagery and offshore chlorophyll-a samples are consistent with the postulated mechanism. Our results suggest that benthic community dynamics can be coupled to pelagic ecosystems by both trophic and transport linkages.
Abstract:
This work aimed to evaluate several methods for detecting and recovering Giardia spp. cysts and Cryptosporidium parvum oocysts in the residuals generated by the treatment of high-turbidity drinking water, using USEPA Method 1623.1 (2012) as the standard. To this end, jar-test assays (coagulation, flocculation, settling and filtration) were carried out using the coagulant polyaluminium chloride (PAC). In all the methods evaluated, purification by immunomagnetic separation (IMS) was used. The adaptation of the calcium carbonate flocculation (FCCa) method developed by Vesey et al. (1993) and adapted by Feng et al. (2011) produced the best results for the settled-residual sample, with recoveries of 68 ± 17% for C. parvum oocysts and 42 ± 7% for Giardia spp. cysts. However, recoveries for the filter backwash water (ALF) sample were below 1%, so no suitable method could be determined. The presence of these pathogens indicates that reusing filter backwash water in conventional water treatment plants, or discharging it into water sources without prior treatment, may pose contamination problems. The adaptations of the methods of Boni de Oliveira (2012) and Keegan et al. (2008) also yielded substantial recovery percentages for the settled-residual sample: 41 ± 35% for C. parvum oocysts and 11 ± 70% for Giardia spp. cysts, and 38 ± 26% for C. parvum oocysts and 26 ± 13% for Giardia spp. cysts, respectively. Statistical analysis showed no significant difference between these two methods; the high recoveries, however, indicate that they deserve further evaluation in future research.
Abstract:
Producing ethanol via well-established alcohol-chemical routes generates residues with potential for reuse, both in other productive sectors and within the fuel's own production cycle. This is the case for filter cake and vinasse. Vinasse, in particular, is usually returned to the field to adjust the nutrient content of the soil in sugarcane cultivation. However, environmental studies point out that this alternative has negative effects on the receiving media (water and soil), which opens the way to exploring alternative uses for these substances. This study set out to contribute to the subject by systematically evaluating the environmental performance of two alternatives for reusing vinasse: (i) reuse in the field for fertirrigation, a well-established alternative in Brazil, and (ii) reuse of the liquid fraction of vinasse in various stages of the industrial process of ethanol production. In both cases, Life Cycle Assessment (LCA) was used to carry out the evaluation. The environmental analysis of reusing vinasse and filter cake for fertirrigation was conducted by comparing scenarios that considered the way nutrients are supplied to the cane and the harvesting method. Impacts were assessed at two levels: resource consumption, via Primary Energy Demand (PED), and emissions to the environment, via the construction of an Environmental Profile. A sensitivity analysis was also performed to check the effect of variations in the nitrogen (N), phosphorus (P) and potassium (K) content of vinasse on the results obtained for the first alternative. For that case it was concluded that partially replacing chemical fertilisers with vinasse increases the overall Primary Energy Demand for both harvesting methods. In terms of the Environmental Profile, comparing scenarios with the same management practices showed that exchanging fertilisers for vinasse and filter cake is positive for the environmental performance of ethanol, reducing impacts on Climate Change (CC), Terrestrial Acidification (TA) and Human Toxicity (HT). On the other hand, treating vinasse to replace water in the industrial stage resulted in an overall increase in the contributions to the above categories, as well as increases in Freshwater Eutrophication (FEut) and Freshwater Ecotoxicity (FEC), despite an observed 45% reduction in Water Depletion (WD). The increased impacts were mainly due to the negative effects of producing the CaO used in the vinasse treatment process; replacing this input with NaOH, however, only brought improvement in terms of CC. It can be concluded that reusing vinasse and filter cake as a nutritional supplement for sugarcane cultivation is a more suitable reuse alternative than using this fluid to supply part of the process water demand, even though water consumption in the industrial stage is inexorably reduced once that measure is implemented.
Abstract:
This study evaluated the efficiency of oleuropein (OLE), a phenolic compound extracted from olive leaves, alone and combined with the commercial sanitizers 2% peracetic acid (APA), 2% sodium hypochlorite (HS), 3% hydrogen peroxide (PH), 2% chlorhexidine digluconate (DC), 1% benzalkonium chloride (CB) and 2% iodophor (IO), for inactivating suspended cells and mono- and multi-species biofilms formed on stainless steel surfaces or polystyrene microplates by Listeria monocytogenes (ATCC 7644), Staphylococcus aureus (ATCC 25923) and Escherichia coli (ATCC 25922), all classified as strong biofilm producers. The isolates were grown in TSB (tryptic soy broth), incubated (37 °C/24 h) and adjusted to ~10^8 cells/mL (0.5 McFarland scale). For bacteria in suspension, resistance to sanitizers was determined by the Minimum Inhibitory Concentration (MIC) in tubes and by the agar disc diffusion (DDA) method, in which the bacteria were plated on TSA agar containing 6 mm filter paper discs soaked in the sanitizers; after incubation, the inhibition halos were measured with a calliper. For the biofilm resistance assays, 96-well polystyrene microplates were used, prepared for biofilm incubation and fixation and read in an ELISA spectrophotometer (600 nm). The plates were then washed with phosphate-buffered saline (PBS, pH 7.4) and the sanitizers applied for 1 minute. After neutralisation with sodium thiosulfate (5 minutes), the plates were washed with PBS and methanol, stained with 1% crystal violet and treated with glacial acetic acid (33%) for a new reading at 570 nm. The effectiveness of biofilm removal by the sanitizers was compared using the biofilm formation index (IFB). Images of the stainless steel after sanitizer treatment were obtained by Scanning Electron Microscopy (SEM) and confocal microscopy to visualise biofilm persistence. The MIC values (1:2 dilution) showed that OLE had no bactericidal activity. In the DDA method, L. monocytogenes was resistant to OLE, while E. coli and S. aureus showed intermediate resistance. The commercial sanitizers showed good bactericidal activity in the MIC and DDA assays, and combining OLE with the commercial sanitizers increased the germicidal effect. In the mono-species biofilm assays, only the commercial sanitizers, alone or combined with OLE, effectively reduced the BFI value on polystyrene microplates. In multi-species biofilms, OLE showed an antimicrobial effect, especially on the combination of L. monocytogenes + E. coli + S. aureus (91.49% reduction). None of the compounds evaluated completely inactivated the biofilms on stainless steel surfaces, since viable cells were observed after the sanitizer treatments, indicating biofilm persistence. The results indicate that oleuropein has the potential to enhance the bactericidal effect of commercial sanitizers for eliminating biofilms on inert surfaces; further studies are needed to understand the mechanisms of action of these combinations.
Abstract:
The issue: The European Union's emissions trading system (ETS), introduced in 2005, is the centerpiece of EU decarbonisation efforts and the biggest emissions trading scheme in the world. After a peak in May 2008, the price of ETS carbon allowances started to collapse, and industry, civil society and policymakers began to think about how to 'repair the ETS'. Yet the ETS is an effective and efficient tool for mitigating greenhouse gas emissions, and although prices have not been stable, it has evolved to cover more sectors and greenhouse gases and to become more robust and less distorting. Prices are depressed because of an interplay of fundamental factors and a lack of confidence in the system. Policy challenge: The ETS must be stabilised by reinforcing the credibility of the system, so that the use of existing low-carbon alternatives (for example, burning gas instead of coal) is incentivised and investment in low-carbon assets is ensured. Furthermore, failure to reinvigorate the ETS might compromise the cost-effective synchronisation of European decarbonisation efforts across sectors and countries. To restore credibility and ensure long-term commitment to the ETS, the European Investment Bank should auction guarantees on the future emission allowance price. This would reduce the risk for low-carbon investments and enable stabilisation of the ETS until a compromise is found on structural measures to reinforce it, in order to achieve the EU's long-term decarbonisation targets.
Abstract:
Objectives: Queensland, the north-eastern state of Australia, has the highest incidence of melanoma in the world. Control measures started earlier here than probably anywhere else in the world: early detection programmes began in the 1960s and primary prevention in the 1980s. Data from the population-based Queensland Cancer Registry therefore provide an internationally unique source with which to assess trends for in situ and invasive melanomas and to consider the implications for early detection and primary prevention. Methods: We used Poisson regression to estimate the annual percentage change in rates across 21 years of incidence data for in situ and invasive lesions, stratified by age and sex. Joinpoint analyses were used to assess whether there had been a statistically significant change in the trends. Results: In situ melanomas increased by 10.4% (95% CI: 10.1%, 11.1%) per year among males and 8.4% (7.9%, 8.9%) per year among females. The incidence of invasive lesions also increased, but not as quickly: males 2.6% (2.4%, 2.8%), females 1.2% (0.9%, 1.5%). Valid data on thickness were only available for 1991 to 2002, and for this period thin invasive lesions were increasing faster than thick invasive lesions (for example, among males: thin 3.8%, thick 2.0%). We found some suggestive evidence of a lower proportionate increase in the most recent years for both in situ and invasive lesions, but this did not achieve statistical significance. Among people younger than 35 years, the incidence of invasive melanoma was stable, and there was a suggestion of a birth cohort effect from about 1958. Mortality rates were stable across all ages, and there was a suggestion of decreasing rates among young women, although this did not achieve statistical significance. Conclusion: Age-standardised incidence is continuing to increase and this, in combination with a shift to proportionately more in situ lesions, suggests that the stabilisation of mortality rates is due, in large part, to earlier detection. For primary prevention, after a substantial period of sustained effort in Queensland, there is some suggestive, but not definitive, evidence that progress is being made: incidence rates are stabilising in those younger than 35 years, and the proportionate increase for both in situ and invasive lesions appears to be lower for the most recent period compared with previous periods. However, even taking the most favourable view of these trends, primary prevention is unlikely to lead to decreases in the overall incidence rate of melanoma for at least another 20 years. Consequently, the challenge for primary prevention programmes will be to maintain momentum over the long term. If this can be achieved, the eventual public-health benefits are likely to be substantial.
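For readers unfamiliar with the method, a minimal sketch of estimating an annual percentage change (APC) by Poisson regression follows; the case counts and person-years are simulated for illustration, not Queensland registry data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Illustrative data: 21 years of case counts with person-years at risk,
# simulated with a true underlying trend of roughly 8% per year.
years = np.arange(1982, 2003)
pyears = np.full(years.size, 2.5e6)
true_rate = 20e-5 * np.exp(0.077 * (years - years[0]))
cases = rng.poisson(true_rate * pyears)

# Model: log(rate) = intercept + slope * year, fitted as a Poisson GLM with
# a log person-years offset. Then APC = 100 * (exp(slope) - 1).
X = sm.add_constant(years - years[0])
fit = sm.GLM(cases, X, family=sm.families.Poisson(),
             offset=np.log(pyears)).fit()
slope = fit.params[1]
lo, hi = fit.conf_int()[1]
print(f"APC = {100 * (np.exp(slope) - 1):.1f}% "
      f"(95% CI {100 * (np.exp(lo) - 1):.1f}%, {100 * (np.exp(hi) - 1):.1f}%)")
```

Joinpoint analysis extends this idea by allowing the slope, and hence the APC, to change at estimated breakpoints, which is how the abstract tests whether recent trends differ from earlier ones.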
Abstract:
This thesis examines experimentally options for optical fibre transmission over oceanic distances. Its format follows the chronological evolution of ultra-long-haul optical systems, commencing with opto-electronic regenerators as repeaters, progressing to optically amplified NRZ systems and finally to solitonic propagation. In each case recirculating loop techniques are deployed to simplify the transmission experiments. Advances in high-speed electronics have allowed regenerators operating at 10 Gbit/s to become a practical reality. By augmenting such devices with optical amplifiers it is possible to greatly enhance the repeater spacing. Work detailed in this thesis has culminated in the propagation of 10 Gbit/s data over 400,000 km with a repeater spacing of 160 km. System reliability and robustness are enhanced by the use of a directly modulated DFB laser transmitter and total insensitivity of the system to the signal state of polarisation. Optically amplified ultra-long-haul NRZ systems have taken on particular importance with the impending deployment of TAT 12/13 and TPC 5. The performance of these systems is demonstrated to be primarily limited by analogue impairments such as the accumulation of amplifier noise, polarisation effects and optical non-linearities. These degradations may be reduced by the use of appropriate dispersion maps and by scrambling the transmitted state of signal polarisation. A novel high-speed optically passive polarisation scrambler is detailed for the first time. At bit rates in excess of 10 Gbit/s it is shown that these systems are severely limited and do not offer the advantages that might be expected over regenerated links. Propagation using solitons as the data bits appears particularly attractive, since the dispersive and non-linear effects of the fibre allow distortion-free transmission. However, the generation of pure solitons is difficult but must be achieved if the uncontrolled transmission distance is to be maximised. This thesis presents a new technique for the stabilisation of an erbium fibre ring laser that has allowed propagation of 2.5 Gbit/s solitons to the theoretical limit of ~18,000 km. At higher bit rates temporal jitter becomes a significant impairment, and to allow an increase in the aggregate line rate, multiplexing in both the time and polarisation domains has been proposed. These techniques are shown to be of only limited benefit in practical systems, and ultimately some form of soliton transmission control is required. The thesis demonstrates synchronous retiming by amplitude modulation that has allowed 20 Gbit/s data to propagate 125,000 km error-free with an amplifier spacing approaching the soliton period. Ultimately the speed of operation of such systems is limited by the electronics used, and thus a new form of soliton control is demonstrated using all-optical techniques to achieve synchronous phase modulation.
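Soliton propagation itself is easy to illustrate numerically, even though the thesis is experimental: in the normalised non-linear Schrodinger equation a sech pulse propagates unchanged because dispersion and the Kerr non-linearity cancel exactly. The split-step Fourier sketch below, with illustrative grid and step sizes of my own choosing, demonstrates this balance.

```python
import numpy as np

# Split-step Fourier integration of the normalised non-linear Schrodinger
# equation i*u_z + (1/2)*u_tt + |u|^2 * u = 0, the standard soliton model.
# A fundamental soliton, sech(t), should emerge with its shape intact.

nt, tmax = 1024, 20.0
t = np.linspace(-tmax, tmax, nt, endpoint=False)
w = 2 * np.pi * np.fft.fftfreq(nt, d=t[1] - t[0])   # angular frequencies

u = 1 / np.cosh(t)        # fundamental soliton launch pulse
dz, steps = 0.01, 1000    # total distance z = 10, several soliton periods

half_disp = np.exp(-0.5j * w**2 * dz / 2)   # half-step dispersion operator
for _ in range(steps):
    u = np.fft.ifft(half_disp * np.fft.fft(u))   # dispersion, half step
    u *= np.exp(1j * np.abs(u)**2 * dz)          # Kerr non-linearity, full step
    u = np.fft.ifft(half_disp * np.fft.fft(u))   # dispersion, half step

# Peak power stays ~1 for a true soliton; a non-soliton pulse would disperse.
print("peak power:", np.max(np.abs(u))**2)
```

Launching anything other than a pure sech profile sheds dispersive radiation, which is why the thesis stresses that pure soliton generation is essential for maximising the uncontrolled transmission distance.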
Abstract:
This collection of papers records a series of studies, carried out over a period of some 50 years, on two aspects of river pollution control: the prevention of pollution from sewage by biological filtration, and the monitoring of river pollution by biological surveillance. The earlier studies were carried out to develop methods of controlling flies which bred in the filters and caused serious nuisance, and a possible public health hazard, when they dispersed to surrounding villages. Although the application of insecticides proved effective as a palliative measure, because it resulted in only a temporary disturbance of the ecological balance it was considered ecologically unsound as a long-term solution. Subsequent investigations showed that the fly populations in filters were largely determined by the amount of food available to the grazing larval stage in the form of filter film. It was also established that the winter deterioration in filter performance was due to the excessive accumulation of film. Further investigations were therefore carried out to determine the factors responsible for the accumulation of film in different types of filter. Methods of filtration which were thought to control film accumulation by increasing the flushing action of the sewage were found instead to control fungal film by creating nutrient-limiting conditions. In some filters, increasing the hydraulic flushing reduced the grazing fauna population in the surface layers and resulted in an increase in film. The results of these investigations were successfully applied in modifying filters and in the design of a Double Filtration process. These studies on biological filters led to the conclusion that they should be designed and operated as ecological systems and not merely as hydraulic ones. Studies on the effects of sewage effluents on Birmingham streams confirmed the findings of earlier workers, justifying their claim for using biological methods for detecting and assessing river pollution. Further ecological studies showed the sensitivity of benthic riffle communities to organic pollution. Using experimental channels and laboratory studies, the different environmental conditions associated with organic pollution were investigated. The degree and duration of oxygen depletion during the dark hours were found to be a critical factor. The relative tolerance of different taxa to other pollutants, such as ammonia, differed. Although colonisation samplers proved of value in sampling difficult sites, the invertebrate data generated were not suitable for processing as any of the commonly used biotic indices. Several of the papers, written by request for presentation at conferences and the like, presented the biological viewpoint on river pollution and water quality issues of the time and advocated the use of biological methods. The information and experience gained in these investigations were used as the "domain expert" in the development of artificial intelligence systems for the biological surveillance of river water quality.
Abstract:
FULL TEXT: Like many people, one of my favourite pastimes over the holiday season is to watch the great movies offered on the television channels and the new releases in the movie theatres, or to catch up on those DVDs I have been wanting to watch all year. Recently we had the new 'Star Wars' movie, 'The Force Awakens', which is reckoned to become the highest-grossing movie of all time, and the latest offering from James Bond, 'Spectre' (which included, for the car aficionados among you, the gorgeous new Aston Martin DB10). It is always amusing to see how vision correction or eye injury is dealt with by movie makers. Spy movies and science fiction movies have a free hand to design aliens with multiple eyes on stalks, retina-scanning door locks or goggles that can see through walls. Eye surgery is usually shown as some kind of simplified day-case laser treatment that gives instant results, apart from the great scene in the original 'Terminator' movie where Arnold Schwarzenegger's android character suffers an injury to one eye and then proceeds to remove the humanoid covering of this mechanical eye over a bathroom sink. I suppose it is much more difficult to try to include contact lenses in such movies. Although you may recall the film 'Charlie's Angels', which did have a scene where one of the Angels wore a contact lens with a retinal image imprinted on it so she could bypass a retinal-scan door lock, and an Eddie Murphy spy movie, 'I-Spy', where he wore contact lenses with electronic gadgetry that allowed whatever he was looking at to be beamed back to someone else, a kind of remote video camera device. Maybe we aren't quite there in terms of the devices available, but these things are probably not the stuff of science fiction anymore, as the technology to put them together does exist. The technology to incorporate electronics into contact lenses is being developed, and I am sure we will be reporting on it in the near future. In the meantime we can continue to enjoy the unrealistic scenes of eye swapping, as in the film 'Minority Report' (with Tom Cruise). Much closer to home than a galaxy far, far away, in this issue you can find articles on topics much nearer to the immediate future. More and more optometrists in the UK are becoming registered for therapeutic work as independent prescribers, and the number is likely to rise in the near future. These practitioners will be interested in the review paper by Michael Doughty, who is a member of the CLAE editorial panel (soon to be renamed the Jedi Council!), on prescribing drugs as part of the management of chronic meibomian gland dysfunction. Contact lenses play an active role in myopia control, and orthokeratology has been used not only to provide refractive correction but also to retard myopia. In this issue there are three articles related to this topic: first, an excellent paper looking at the link between higher spherical equivalent refractive errors and slower axial elongation; second, a paper that discusses the effectiveness and safety of overnight orthokeratology with a high-permeability lens material; and finally, a paper that looks at the stabilisation of early adult-onset myopia. Whilst we are always eager for new and exciting developments in contact lenses and related instrumentation, in this issue of CLAE there is a demonstration of a novel and practical use of a smartphone to assist anterior segment imaging, and suggestions of how this may be used in telemedicine.
It is not hard to imagine someone taking an image remotely and transmitting it back to a central diagnostic centre, with the relevant expertise housed in one place, where the information can be interpreted and instructions given back to the remote site. Back to 'Star Wars': you will recall in the film 'The Phantom Menace', when Qui-Gon Jinn first meets Anakin Skywalker on Tatooine, he takes a sample of his blood and sends a scan of it to Obi-Wan Kenobi for analysis, and they find that the boy has the highest midichlorian count ever seen. On behalf of the CLAE Editorial Board (or Jedi Council) and the BCLA Council (the Senate of the Republic) we wish for you a great 2016 and 'may the contact lens force be with you'. Or let me put that another way: 'the CLAE Editorial Board and BCLA Council, on behalf of, a great 2016, we wish for you!'