14 results for OCCAM
Abstract:
Research project carried out during a stay at the National Oceanography Centre, Southampton (NOCS), United Kingdom, between May and July 2006. The ability to obtain an accurate estimate of sea surface salinity (SSS) is important for investigating and predicting the extent of climate change. The Soil Moisture and Ocean Salinity (SMOS) mission was selected by the European Space Agency (ESA) to provide sea surface salinity maps on a global scale with a short revisit time. Ahead of the SMOS launch, the plan is to analyse the horizontal variability of SSS and the potential of data retrieved from SMOS measurements to reproduce known oceanographic behaviour. The overall objective is to fill the gap between reliable input/auxiliary data sources and the tools developed to simulate and process data acquired under the SMOS configuration. The SMOS End-to-end Performance Simulator (SEPS) is an ad hoc simulator developed by the Universitat Politècnica de Catalunya (UPC) to generate data according to the SMOS configuration. Input data for SEPS were taken from the Ocean Circulation and Climate Advanced Modeling (OCCAM) project, used at NOCS, at different spatial resolutions. By modifying SEPS to accept OCCAM data as input, simulated brightness-temperature data were obtained over one month, with different ascending passes covering the selected area. The work carried out during the stay at NOCS aimed to provide a reliable technique for external calibration, and hence bias cancellation; a methodology for temporally averaging the different acquisitions during ascending passes; and a determination of the best cost-function configuration, before exploiting and investigating the potential of the SEPS/OCCAM data for deriving retrieved SSS with high-resolution patterns.
Abstract:
BACKGROUND: Since the introduction of endoscopic endonasal approaches in the field of skull base surgery over the last two decades, several variants of endoscopic sella turcica surgery have been described. The aim of this study is to provide a stepwise description of one of these variants from a minimally invasive/maximally efficient perspective. METHOD: For the majority of our sella turcica pathologies, we have progressively adopted a uninostril endoscopic approach that is very conservative towards the nasal mucosa, with a very limited mucosal incision and resection of the vomer, and that allows an almost ad integrum reconstruction of the sellar floor, without compromising the efficacy and completeness of both surgical oncologic and endocrine targets. CONCLUSION: The uninostril trans-sphenoidal endoscopic endonasal approach to the sella turcica is tailored to combine maximal efficiency with minimal invasiveness.
Abstract:
Mode of access: Internet.
Abstract:
"Bibliographie": v.1, p.[273]-306.
Abstract:
The radical principle of Ockham's entire philosophy is neither a metaphysics of the singular, nor a logicist, theological or sceptical attitude. The Inceptor is the culmination of a Neoplatonic tendency, also central in Eriugena: even for the human creature, the "intelligent" is prior and ontologically superior to the "intelligible" or "sensible". Inherited from Augustine of Hippo, but transformed through analysis until it became the immediate precedent of Modernity, or its very beginning, the "I think" is the hinge of Ockham's system. The whole system stems from the precise modern delimitation of "spirit"; from it are justified even the features most surprising to find in a fourteenth-century philosopher: the centrality of the opposed notions "intuition/evidence", the full cognitive activity of the spirit alongside the passivity of the intellect, the null proximate causality of sensible things in human knowledge, and the perfectly linguistic paradigm of knowing. The birth of a modern "theory of knowledge" arises from the observation that the most perfect and paradigmatic act of the life of the spirit is the intuition of the "intellectualia" rooted in the spirit itself, not the apprehension of the external Aristotelian "sensibilia".
Abstract:
A novel two-stage construction algorithm for linear-in-the-parameters classifiers is proposed, aimed at noisy two-class classification problems. The purpose of the first stage is to produce a prefiltered signal that is used as the desired output for the second stage, which constructs a sparse linear-in-the-parameters classifier. For the first-stage learning that generates the prefiltered signal, a two-level algorithm is introduced to maximise the model's generalisation capability: at the lower level, an elastic-net model identification algorithm using singular value decomposition is employed, while the two regularisation parameters are selected by maximising the Bayesian evidence using a particle swarm optimisation algorithm. Analysis is provided to demonstrate how "Occam's razor" is embodied in this approach. The second stage of sparse classifier construction is based on orthogonal forward regression with the D-optimality algorithm. Extensive experimental results demonstrate that the proposed approach is effective and yields competitive results on noisy data sets.
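The abstract does not spell out the lower-level identification step, so what follows is only a rough sketch of the idea it builds on: one SVD of the design matrix yields the quadratically regularised solution for any value of the regularisation parameter, which is what makes evidence-driven parameter searches cheap. The function name and synthetic data below are illustrative, not from the paper, and only the ridge (quadratic) half of the elastic net is shown.

```python
import numpy as np

def ridge_via_svd(X, y, lam):
    # Solve min_w ||y - X w||^2 + lam ||w||^2 via the SVD of X.
    # The SVD is computed once; every new lam only changes the cheap
    # diagonal filter factors below, so scanning regularisation values
    # (e.g. inside an evidence-maximisation loop) costs almost nothing.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    d = s / (s**2 + lam)          # filter factors shrink weak directions
    return Vt.T @ (d * (U.T @ y))

# illustrative synthetic problem
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
w_true = np.array([1.0, -2.0, 0.0, 0.5, 3.0])
y = X @ w_true + 0.1 * rng.normal(size=50)
w = ridge_via_svd(X, y, lam=1.0)
```

The SVD route also exposes the effective number of parameters (the sum of the filter factors), a quantity that evidence-based "Occam's razor" arguments typically use.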
Abstract:
Using magnetotelluric (MT) data, a sharp geoelectric image of the Juruá region, Solimões Basin, was obtained in the form of geoelectric sections. The field data were recorded along three 15 km lines, spaced 3.5 km apart, covering an area of 100 km². The spacing between the 35 stations is irregular, varying from 400 m to 3500 m. The frequency range used spanned 0.001 Hz to 300 Hz, allowing depths from 100 m down to 60 km to be investigated. The data are affected by static-shift distortion. To correct this effect, the median resistivity of the first conductor, corresponding to the Solimões Formation, was used. The invariant of the MT tensor was used to interpret the geoelectric structure of Juruá. The geoelectric sections were obtained by grouping, for each station, the results of the Bostick transform and of 1-D Occam inversion. A sequence of conductive and resistive layers was identified, corresponding to the sedimentary package, a fault zone and the top of the geoelectric basement, characterizing the Solimões Basin. Below the geoelectric basement, a conductive zone was also identified, followed by a layer of low conductivity at depths of 20 km or more. This layer is interpreted as being of gabbro composition, associated with vertical accretion processes closely linked to crustal stabilization and thickening of the lithosphere. The results show good agreement with well resistivity logs and surface seismic data.
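The sections above combine the Bostick transform with 1-D Occam inversion. The authors' code is not available, but the Niblett-Bostick approximation the abstract names is a standard depth transform of MT apparent resistivity and can be sketched in a few lines of Python (the function name and the synthetic half-space check are illustrative assumptions):

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, H/m

def bostick(periods, rho_app):
    # Niblett-Bostick approximate transform: maps apparent resistivity
    # as a function of period T into resistivity as a function of depth.
    depth = np.sqrt(rho_app * periods / (2.0 * np.pi * MU0))
    # slope of log(rho_a) against log(T), estimated numerically
    m = np.gradient(np.log(rho_app), np.log(periods))
    rho_b = rho_app * (1.0 + m) / (1.0 - m)
    return depth, rho_b

# sanity check on a uniform 100 ohm-m half-space: the transform should
# return the same resistivity at every depth, with depth growing as sqrt(T)
T = np.logspace(-3, 2, 20)                  # periods, s
depth, rho_b = bostick(T, np.full_like(T, 100.0))
```

Unlike the smoothness-regularised Occam inversion, this is a direct point-by-point approximation, which is why the two are often used together as the abstract describes: Bostick for a fast first image, Occam for the smooth model.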
Abstract:
A collection of miscellaneous pamphlets on religion.
Abstract:
Using current software engineering technology, the robustness required for safety-critical software cannot be assured. However, different approaches are possible which can help to assure software robustness to some extent. To achieve highly reliable software, methods should be adopted which avoid introducing faults (fault avoidance); testing should then be carried out to identify any faults which persist (fault removal); finally, techniques should be used which allow any undetected faults to be tolerated (fault tolerance). Verification of correctness in the system design specification, and performance analysis of the model, are the basic issues in concurrent systems. In this context, modelling distributed concurrent software is one of the most important activities in the software life cycle, and communication analysis is a primary consideration in achieving reliability and safety. By and large, fault avoidance requires human analysis, which is error prone; by reducing human involvement in the tedious aspects of modelling and analysing the software, it is hoped that fewer faults will persist into its implementation in the real-time environment. The Occam language supports concurrent programming and is a language in which interprocess interaction takes place by communication; this may lead to deadlock due to communication failure. Proper systematic methods must be adopted in the design of concurrent software for distributed computing systems if the communication structure is to be free of pathologies such as deadlock. The objective of this thesis is to provide a design environment which ensures that processes are free from deadlock. A software tool was designed and used to facilitate the production of fault-tolerant software for distributed concurrent systems.
Where Occam is used as a design language, state-space methods such as Petri nets can be used in analysis and simulation to determine the dynamic behaviour of the software, and to identify structures which may be prone to deadlock so that they may be eliminated from the design before the program is ever run. This design software tool consists of two parts. The first takes an input program and translates it into a mathematical model (a Petri net), which is used for modelling and analysis of the concurrent software. The second is a Petri-net simulator, which takes the translated program as its input and runs a simulation to generate the reachability tree. The tree identifies `deadlock potential' which the user can explore further. Finally, the software tool has been applied to a number of Occam programs; two examples show how the tool works in the early design phase for fault prevention before the program is ever run.
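The thesis tool itself is not reproduced in the abstract. As a minimal illustration of the reachability analysis it describes, the sketch below enumerates the markings of a Petri net breadth-first and flags any reachable marking with no enabled transition as a deadlock. The list-of-lists net encoding, the names, and the tiny example net are assumptions for illustration, not the thesis's own format.

```python
from collections import deque

def enabled(marking, pre):
    # transitions whose input places all hold enough tokens
    return [t for t, need in enumerate(pre)
            if all(marking[p] >= n for p, n in enumerate(need))]

def find_deadlocks(m0, pre, post, limit=10000):
    # Breadth-first construction of the reachability set of a Petri net.
    # pre[t][p] / post[t][p] give tokens consumed / produced by
    # transition t at place p. A reachable marking with no enabled
    # transition is a deadlock. `limit` bounds the search, since the
    # reachability set of an unbounded net is infinite.
    seen, dead = {tuple(m0)}, []
    queue = deque([tuple(m0)])
    while queue and len(seen) < limit:
        m = queue.popleft()
        ts = enabled(m, pre)
        if not ts:
            dead.append(m)
        for t in ts:
            nxt = tuple(m[p] - pre[t][p] + post[t][p] for p in range(len(m)))
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return dead

# toy pipeline p0 -t0-> p1 -t1-> p2: the final marking has no enabled
# transition, so the search reports it as the single (terminal) deadlock
pre  = [[1, 0, 0], [0, 1, 0]]
post = [[0, 1, 0], [0, 0, 1]]
dead = find_deadlocks((1, 0, 0), pre, post)
```

In a real tool (as in the thesis) the interesting cases are markings like this that are *not* intended termination states, e.g. two Occam processes each blocked on a channel communication the other will never perform.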
Abstract:
The thesis describes an investigation into methods for the specification, design and implementation of computer control systems for flexible manufacturing machines comprising multiple, independent, electromechanically driven mechanisms. An analysis is made of the elements of conventional mechanically coupled machines so that the operational functions of these elements may be identified. This analysis is used to define the scope of the requirements necessary to specify the format, function and operation of a flexible machine built from independently driven mechanisms. A discussion of how this type of machine can accommodate the modern manufacturing needs of high speed and flexibility is presented. A sequential method of capturing requirements for such machines is detailed, based on a hierarchical partitioning of machine requirements from product down to each independent drive mechanism (IDM). A classification of mechanisms using notations including data flow diagrams and Petri nets is described, which supports capture and allows validation of requirements. A generic design for a modular IDM machine controller is derived, based upon the hierarchy of control identified in these machines. A two-mechanism experimental machine is detailed, which is used to demonstrate the application of the specification, design and implementation techniques. A prototype computer controller and a fully flexible implementation for the IDM machine, based on Petri-net models described using the concurrent programming language Occam, are detailed. The ability of this modular computer controller to support flexible, safe and fault-tolerant operation of the two intermittent-motion, discrete-synchronisation independent drive mechanisms is presented. The application of the machine development methodology to industrial projects is established.
Abstract:
The requirement for systems to continue to operate satisfactorily in the presence of faults has led to the development of techniques for the construction of fault-tolerant software. This thesis addresses the problem of error detection and recovery in distributed systems which consist of a set of communicating sequential processes. A method is presented for the `a priori' design of conversations for this class of distributed system. Petri nets are used to represent the state and to solve state-reachability problems for concurrent systems. The dynamic behaviour of the system can be characterised by a state-change table derived from the state reachability tree. Systematic conversation generation is possible by defining a closed boundary on any branch of the state-change table. Relating the state-change table to process attributes ensures that all necessary processes are included in the conversation; the method also ensures properly nested conversations. An implementation of the conversation scheme using the concurrent language occam is proposed, with the structure of the conversation defined using the special features of occam. The proposed implementation gives a structure which is independent both of the application and of the number of processes involved. Finally, the integrity of inter-process communications is investigated. The basic communication primitives used in message-passing systems are seen to have deficiencies when applied to systems with safety implications. Using a Petri-net model, a boundary for a time-out mechanism is proposed which will increase the integrity of a system involving inter-process communications.
Abstract:
Image segmentation is one of the most computationally intensive operations in image processing and computer vision, because a large volume of data is involved and many different features have to be extracted from the image data. This thesis is concerned with the investigation of practical issues related to the implementation of several classes of image segmentation algorithms on parallel architectures. The Transputer is used as the basic building block of the hardware architectures, and Occam is used as the programming language. The segmentation methods chosen for implementation are convolution, for edge-based segmentation; the Split and Merge algorithm, for segmenting non-textured regions; and the Granlund method, for segmentation of textured images. Three different convolution methods have been implemented. The direct method of convolution, carried out in the spatial domain, uses the array architecture. The other two methods, based on convolution in the frequency domain, require the two-dimensional Fourier transform. Parallel implementations of two different Fast Fourier Transform algorithms have been developed, incorporating original solutions: for the Row-Column method the array architecture has been adopted, and for the Vector-Radix method, the pyramid architecture. The texture segmentation algorithm, for which a system-level design is given, demonstrates a further application of the Vector-Radix Fourier transform. A novel concurrent version of the quad-tree-based Split and Merge algorithm has been implemented on the pyramid architecture. The performance of the developed parallel implementations is analysed; many of the obtained speed-up and efficiency measures show values close to their respective theoretical maxima. Where appropriate, comparisons are drawn between different implementations.
The thesis concludes with comments on general issues related to the use of the Transputer system as a development tool for image processing applications, and on issues related to the engineering of concurrent image processing applications.
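The Transputer/Occam implementations themselves are not shown in the abstract, but the Row-Column decomposition it parallelises is easy to sketch: a 2-D FFT factors into independent 1-D FFTs along every row, followed by independent 1-D FFTs along every column, and it is this independence that maps onto an array of processors. The NumPy sketch below is illustrative only (the thesis targets Occam on Transputer arrays, not NumPy):

```python
import numpy as np

def fft2_row_column(a):
    # Row-column 2-D FFT: 1-D transforms along the rows, then along the
    # columns. Within each stage every row (or column) is independent,
    # so each can be assigned to a separate processor in an array.
    rows = np.fft.fft(a, axis=1)     # stage 1: all rows in parallel
    return np.fft.fft(rows, axis=0)  # stage 2: all columns in parallel

# illustrative input; the result matches a direct 2-D FFT
a = np.random.default_rng(1).normal(size=(8, 8))
```

On real hardware the cost of this method is dominated by the transpose-like redistribution of data between the two stages, which is why the abstract's alternative, the Vector-Radix method, trades it for a different communication pattern (here, the pyramid architecture).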
Abstract:
The growth in energy demand projected for the middle of the twenty-first century, with figures based on demographic and consumption growth in developing countries, motivates the search for renewable energy sources with lower environmental impact, in line with international policy treaties. The supply of supplementary energy is therefore vital in modern societies, and its extension out to sea has become a recent concern from both the energy and the ecological standpoints. Several forms of energy conversion have been developed over the years, notably energy from thermal gradients. The Southern Brazilian Continental Shelf (PSCB) exhibits high spatial and temporal variability in its temperature fields, so an analysis of the regions of greatest energy potential with respect to the vertical temperature gradient is needed. In this study, data from the OCCAM model were used, with a horizontal grid resolution of 0.25° and a vertical resolution of 66 levels distributed along a vertical coordinate system. Sea surface temperature (SST) images obtained from the AVHRR (Advanced Very High Resolution Radiometer) sensor were used to validate the OCCAM model data. Analysis of the model's mean fields indicated the most viable energy site, owing to a mean thermal-gradient pattern of approximately 0.17 °C along the vertical water column (545 m deep) in the ocean. At this location, data were collected and applied to an ocean thermal energy conversion module under development at the Universidade Federal do Rio Grande - FURG. The study region proved to contain a site with excellent energy potential, where maximum energy production can reach 111.9 MW, associated with a dominant temporal-variability pattern of 12 months.
This energy site shows greater efficiency during the summer and autumn periods across the years, and its average over the whole period is 94.3 MW. In this study, two currents, the Brazil Current (CB) and the Coastal Counter Current (CCC), carrying waters of tropical and subantarctic origin with continental inputs, respectively, correlate strongly with the thermal-gradient values and with the significant energy-conversion events. The energy site showed high stability against seasonality and against the range of meteorological and oceanographic events, so it can be qualified as a supplementary source for the country's energy matrix in the near future.