911 results for CIDOC Conceptual Reference Model
Abstract:
This scientific and technological research article studies the perception of safety in the use of pedestrian bridges, employing an approach grounded in two main fields: microeconomics and psychology. The work simultaneously estimates a hybrid model of choice and latent variables with data from a stated-preference survey, finding a better fit than a reference mixed model, which indicates that the perception of safety determines pedestrians' behavior when they face the decision of whether or not to use a pedestrian bridge. Sex, age and level of education were found to be attributes that influence the perception of safety. The calibrated model suggests several strategies, which are discussed, for increasing the use of pedestrian bridges; it was found that the use of barriers causes a loss of utility for pedestrians that should be studied as an extension of the present work.
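As a rough illustration of the model structure described above, the sketch below simulates a binary logit choice with a latent safety-perception variable driven by sex, age and education. All coefficients, variable names and data are hypothetical assumptions for illustration; the actual model in the article was estimated jointly from stated-preference data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Socio-demographics the abstract reports as relevant: sex, age, education.
sex = rng.integers(0, 2, n)
age = rng.uniform(18, 70, n)
edu = rng.integers(0, 3, n)

# Structural equation of the latent variable: perceived safety
# (all coefficients below are invented for this sketch).
safety = 0.4 * sex - 0.01 * age + 0.3 * edu + rng.normal(size=n)

# Choice model: utility of using the pedestrian bridge depends on an
# observed attribute (extra walking time) and on perceived safety.
walk_time = rng.uniform(1, 5, n)               # extra minutes to use the bridge
v_bridge = -0.5 * walk_time + 0.8 * safety
p_bridge = 1.0 / (1.0 + np.exp(-v_bridge))     # binary logit probability

share = p_bridge.mean()   # predicted share of pedestrians using the bridge
```

In a full hybrid choice model the structural and choice equations are estimated simultaneously, together with measurement equations linking the latent variable to survey indicators, which this sketch omits.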
Abstract:
Mathematical modelling of wastewater treatment plants (ETAR, in the Portuguese acronym) has been an enormously useful tool in the design and operation phases of these treatment facilities. The main objective of the present study was to build a mathematical model of the Bragança ETAR, in particular of its activated-sludge biological treatment, in order to evaluate, understand and optimize its performance. The model was built using the WRc STOAT 5.0 simulation environment. The activated-sludge process was described by the ASAL3 reference model. The model was calibrated and validated against experimental data from 2015, obtained under the plant's analytical control programme. The model was also used to evaluate effluent quality in response to changes in influent flow and composition, to changes in operating conditions and to alternative treatment configurations. The model proved quite adequate in describing the monthly evolution of the quality of the plant's final effluent with respect to Total Suspended Solids (TSS) and Biochemical Oxygen Demand (BOD5), although it tends to underestimate them by 1.5 and 3.5 mg/L, respectively. For total nitrogen, the simulated values approached the real values when the internal recirculation rates were increased to 400%, a factor of about 4 times higher. The model and scenario results show and reinforce the good performance and optimized operation of the plant with respect to TSS and BOD5 removal. Regarding total nitrogen, the plant does not systematically ensure a high removal efficiency, but it performs well relative to what the model can explain under the same operating conditions. Scenario studies sought efficient and viable treatment alternatives for total nitrogen removal, but no solutions were identified that would ensure nitrogen discharges below the legal limits.
The best results achieved for the removal of this contaminant are associated with increasing the internal recirculation rates of the existing pre-anoxic system and with a four-stage Bardenpho configuration with the feed distributed, in equal proportions, between the two anoxic stages. Other solutions involving different technologies can and should be considered in future projects aimed at improving the plant's nitrogen removal efficiency.
Abstract:
Dissertation (master's)—Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2015.
Abstract:
Dissertation (master's)—Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Civil e Ambiental, 2015.
Abstract:
The proliferation of new mobile communication devices, such as smartphones and tablets, has led to an exponential growth in network traffic. The demand for supporting fast-growing consumer data rates urges wireless service providers and researchers to seek a new efficient radio access technology, the so-called 5G technology, beyond what current 4G LTE can provide. On the other hand, ubiquitous RFID tags, sensors, actuators, mobile phones and the like cut across many areas of modern-day living, offering the ability to measure, infer and understand environmental indicators. The proliferation of these devices has given rise to the term Internet of Things (IoT). For researchers and engineers in the field of wireless communication, the exploration of new effective techniques to support 5G communication and the IoT has become an urgent task, one that not only leads to fruitful research but also enhances the quality of our everyday life. Massive MIMO, which has shown great potential for improving the achievable rate with a very large number of antennas, has become a popular candidate. However, the requirement of deploying a large number of antennas at the base station may not be feasible in indoor scenarios. Does a good alternative exist that can achieve system performance similar to massive MIMO in indoor environments? In this dissertation, we address this question by proposing the time-reversal (TR) technique as a counterpart of massive MIMO in indoor scenarios with a massive multipath effect. It is well known that radio signals experience many multipaths due to reflection from various scatterers, especially in indoor environments. The traditional TR waveform is able to create a focusing effect at the intended receiver with very low transmitter complexity in a severe multipath channel. TR's focusing effect is in essence a spatial-temporal resonance effect that brings all the multipaths to arrive at a particular location at a specific moment.
We show that by using time-reversal signal processing, with a sufficiently large bandwidth, one can harvest the massive multipaths naturally existing in a rich-scattering environment to form a large number of virtual antennas and achieve the desired massive multipath effect with a single antenna. Further, we explore the optimal bandwidth for a TR system to achieve maximal spectral efficiency. Through evaluating the spectral efficiency, the optimal bandwidth for a TR system is found to be determined by the system parameters, e.g., the number of users and the backoff factor, rather than by the waveform type. Moreover, we investigate the tradeoff between complexity and performance by establishing a generalized relationship between system performance and waveform quantization in a practical communication system. It is shown that 4-bit quantized waveforms can be used to achieve a bit-error rate similar to that of a TR system with perfect-precision waveforms. Besides 5G technology, the Internet of Things (IoT) is another area that has recently attracted more and more attention from both academia and industry. In the second part of this dissertation, the heterogeneity issue within the IoT is explored. One significant form of heterogeneity, given the massive number of devices in the IoT, is device heterogeneity, i.e., heterogeneous bandwidths and associated radio-frequency (RF) components. Traditional middleware techniques result in fragmentation of the whole network, hampering object interoperability and slowing down the development of a unified reference model for the IoT. We propose a novel TR-based heterogeneous system, which can address the bandwidth heterogeneity and maintain the benefits of TR at the same time. The increase in complexity in the proposed system lies in the digital processing at the access point (AP), instead of at the devices' end, where it can easily be handled with more powerful digital signal processors (DSPs).
Meanwhile, the complexity of the terminal devices stays low and therefore satisfies the low-complexity and scalability requirements of the IoT. Since there is no middleware in the proposed scheme and the additional physical-layer complexity is concentrated on the AP side, the proposed heterogeneous TR system better satisfies the low-complexity and energy-efficiency requirements of the terminal devices (TDs) compared with the middleware approach.
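As a rough sketch of the waveform-quantization tradeoff discussed above, the example below quantizes a basic TR waveform (the time-reversed conjugate of a simulated multipath channel impulse response) to 4 bits per real/imaginary component and compares the focusing peak at the receiver with the full-precision case. The channel model, quantizer design and all parameters are illustrative assumptions, not the dissertation's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 64-tap complex Gaussian multipath channel (illustrative only).
L = 64
h = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) / np.sqrt(2 * L)

# Basic time-reversal waveform: time-reversed complex conjugate of the
# channel impulse response, normalized to unit energy.
g = np.conj(h[::-1])
g /= np.linalg.norm(g)

def quantize(x, bits):
    """Uniform mid-rise quantization of real and imaginary parts."""
    levels = 2 ** bits
    scale = np.max(np.abs(np.concatenate([x.real, x.imag])))
    step = 2 * scale / levels
    def q(v):
        idx = np.clip(np.floor(v / step), -levels // 2, levels // 2 - 1)
        return (idx + 0.5) * step
    return q(x.real) + 1j * q(x.imag)

g4 = quantize(g, 4)

# Focusing peak at the receiver: center tap of the channel/waveform convolution.
peak_full = np.abs(np.convolve(h, g))[L - 1]
peak_q4 = np.abs(np.convolve(h, g4))[L - 1]
print(f"relative peak loss with 4-bit waveform: {1 - peak_q4 / peak_full:.4f}")
```

Even this crude quantizer keeps the focusing peak close to the full-precision one, which is the intuition behind the dissertation's bit-error-rate result.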
Abstract:
Scientific curiosity, exploration of georesources and environmental concerns are pushing the geoscientific research community toward subsurface investigations of ever-increasing complexity. This review explores various approaches to formulate and solve inverse problems in ways that effectively integrate geological concepts with geophysical and hydrogeological data. Modern geostatistical simulation algorithms can produce multiple subsurface realizations that are in agreement with conceptual geological models and statistical rock physics can be used to map these realizations into physical properties that are sensed by the geophysical or hydrogeological data. The inverse problem consists of finding one or an ensemble of such subsurface realizations that are in agreement with the data. The most general inversion frameworks are presently often computationally intractable when applied to large-scale problems and it is necessary to better understand the implications of simplifying (1) the conceptual geological model (e.g., using model compression); (2) the physical forward problem (e.g., using proxy models); and (3) the algorithm used to solve the inverse problem (e.g., Markov chain Monte Carlo or local optimization methods) to reach practical and robust solutions given today's computer resources and knowledge. We also highlight the need to not only use geophysical and hydrogeological data for parameter estimation purposes, but also to use them to falsify or corroborate alternative geological scenarios.
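To make the algorithmic choice in point (3) concrete, here is a minimal Metropolis (Markov chain Monte Carlo) sketch for a toy one-parameter inverse problem with a linear forward model. The forward function, prior bounds, observed datum and noise level are all invented for illustration; real subsurface inverse problems involve high-dimensional realizations and expensive forward simulations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "forward problem": a single subsurface parameter k predicts one
# measurement d = forward(k). Values are illustrative only.
def forward(k):
    return 2.0 * k + 1.0

d_obs, sigma = 5.0, 0.5            # observed datum and noise standard deviation

def log_posterior(k):
    # Flat prior on [0, 5], Gaussian likelihood.
    if not 0.0 <= k <= 5.0:
        return -np.inf
    return -0.5 * ((forward(k) - d_obs) / sigma) ** 2

# Metropolis random walk.
samples, k = [], 2.5
for _ in range(20000):
    k_prop = k + rng.normal(scale=0.3)
    if np.log(rng.uniform()) < log_posterior(k_prop) - log_posterior(k):
        k = k_prop
    samples.append(k)

posterior_mean = np.mean(samples[5000:])   # discard burn-in
```

With this linear forward model the posterior is a (truncated) Gaussian centered at k = 2, so the chain's mean should land near that value; the same accept/reject loop applies unchanged when `forward` is an expensive geophysical simulator.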
Abstract:
This thesis proposes an investigation of the memory of the retorno from a critical perspective that takes the "Portuguese-speaking global South" as its historical and conceptual frame of reference. It reflects on the idea of specificity attributed to the colonization promoted by Portugal in Africa, taking into account the contradictions associated with the migratory movement triggered by the violent process of decolonization of Portuguese Africa. Traumatic memories of the retorno expose violence as a constitutive component of colonial reality, but they also reproduce dynamics that allow racism to be concealed. The exploration of the "attic", taken as a metaphor for the family memory kept in the domestic space, accompanies that of the public archive. The analysis of the official archive and of family memory reflects the attempt to establish a dialogue between history and memory, overcoming the logic of antithesis that traditionally opposes them. Using the critically problematic concept of "postmemory", the thesis reflects on the reconfiguration of the relationship with the past in terms of an idea of "inheritance as a task" taken up in the present. The possibility of "saving" the past from the progressive disappearance of witnesses carries a danger of ideological abuse inherent in the process of transmission. The translation of colonial memories of the retorno from the intimate sphere to the space of public debate shows the relationship between the construction of family mythology and the adoption of Lusotropical discourse. The attempt to define the indecipherable nature of the retornado entails the possibility of sanctioning colonial violence by denying a collective responsibility for colonialism. The thesis presents an attempt to outline the terms of a Portuguese post-colonial question with opaque contours.
The thesis arrives at an open conclusion, articulated around the ever-present risk that post-colonial critical categories will be appropriated by the hegemonic ideology. Through (post)colonial (post)memories, the denunciation of racism as a permanent legacy and the reconfiguration of the colonial archive are possible and necessary operations, but they are not therefore to be taken for granted or free of risks.
Abstract:
Roughly fifteen years ago, the Church of Jesus Christ of Latter-day Saints published a new proposed standard file format, called GEDCOM. It was designed to allow different genealogy programs to exchange data. Five years later, in May 2000, the GENTECH Data Modeling Project appeared, with the support of the Federation of Genealogical Societies (FGS) and other American genealogical societies. It attempted to define a genealogical logical data model to facilitate data exchange between different genealogy programs. Although genealogists deal with an enormous variety of data sources, one of the central concepts of this data model was that all genealogical data could be broken down into a series of short, formal genealogical statements. This was more versatile than merely exporting and importing data records in predefined fields. The project was finally absorbed in 2004 by the National Genealogical Society (NGS). Despite being a genealogical reference in many applications, these models have serious drawbacks when adapting to different cultural and social environments. At present we have no formal proposal for a recognized standard to represent the family domain. Here we propose an alternative conceptual model, largely inherited from the aforementioned models. The design is intended to overcome their limitations. However, its major innovation lies in applying the ontological paradigm when modeling statements and entities.
Abstract:
Purpose: The purpose of this article is to critically review the literature to examine factors that are most consistently related to employment outcome following traumatic brain injury (TBI), with a particular focus on metacognitive skills. It also aims to develop a conceptual model of factors related to employment outcome. Method: The first stage of the review considered 85 studies published between 1980 and December 2003 which investigated factors associated with employment outcome following TBI. English-language studies were identified through searches of Medline and PsycINFO, as well as manual searches of journals and reference lists. The studies were evaluated and rated by two independent raters (Kappa = 0.835) according to the quality of their methodology, based upon nine criteria. Fifty studies met the criteria for inclusion in the second stage of the review, which examined the relationship between a broad range of variables and employment outcome. Results: The factors most consistently associated with employment outcome included pre-injury occupational status, functional status at discharge, global cognitive functioning, perceptual ability, executive functioning, involvement in vocational rehabilitation services and emotional status. Conclusions: A conceptual model is presented which emphasises the importance of metacognitive, emotional and social environment factors for improving employment outcome.
Abstract:
Large (>1600 μm), ingestively masticated particles of bermuda grass (Cynodon dactylon L. Pers.) leaf and stem, labelled with Yb-169 and Ce-144 respectively, were inserted into the rumen digesta raft of heifers grazing bermuda grass. The concentration of markers in digesta sampled from the raft and ventral rumen was monitored at regular intervals over approximately 144 h. The data from the two sampling sites were simultaneously fitted to two-pool (raft and ventral rumen-reticulum) models with either reversible or sequential flow between the two pools. The sequential flow model fitted the data as well as the reversible flow model, but the reversible flow model was used because of its wider applicability. The reversible flow model, hereafter called the raft model, had the following features: a relatively slow age-dependent transfer rate from the raft (means for a gamma-2 distributed rate parameter: leaf 0.0740 v. stem 0.0478 h⁻¹), a very slow first-order reversible flow from the ventral rumen to the raft (mean for leaf and stem 0.010 h⁻¹) and a very rapid first-order exit from the ventral rumen (mean of leaf and stem 0.44 h⁻¹). The raft was calculated to occupy approximately 0.82 of the total rumen DM of the raft and ventral rumen pools. Fitting a sequential two-pool model or a single exponential model individually to values from each of the two sampling sites yielded similar parameter values for both sites and faster rate parameters for leaf as compared with stem, in agreement with the raft model. These results were interpreted as indicating that the raft forms a large, relatively inert pool within the rumen. Particles generated within the raft have difficulty escaping, but once in the ventral rumen pool they escape quickly with a low probability of return to the raft.
It was concluded that the raft model gave a good interpretation of the data and emphasized escape from and movement within the raft as important components of the residence time of leaf and stem particles within the rumen digesta of cattle.
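The raft model described above can be sketched numerically. The code below integrates a simplified two-pool version using the leaf rate parameters quoted in the abstract, treating the raft outflow as first order (ignoring the gamma-2 age dependence) for brevity; the simplification and the time step are assumptions of this sketch, not the fitted model.

```python
import numpy as np

# Rate parameters quoted in the abstract for leaf particles (h^-1).
k_raft_out = 0.0740   # raft -> ventral rumen (first-order approximation here)
k_return   = 0.010    # ventral rumen -> raft (reversible flow)
k_escape   = 0.44     # ventral rumen -> exit from the rumen

dt, hours = 0.01, 144.0
steps = int(hours / dt)
raft, ventral = 1.0, 0.0          # all marker starts in the raft
raft_t, ventral_t = [], []
for _ in range(steps):            # simple forward-Euler integration
    d_raft = -k_raft_out * raft + k_return * ventral
    d_vent = k_raft_out * raft - (k_return + k_escape) * ventral
    raft += d_raft * dt
    ventral += d_vent * dt
    raft_t.append(raft)
    ventral_t.append(ventral)

remaining = raft + ventral        # marker still in the rumen after 144 h
```

Because escape from the ventral rumen (0.44 h⁻¹) is much faster than release from the raft (0.074 h⁻¹), the ventral pool stays small and the slow raft release governs overall residence time, consistent with the interpretation above.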
Abstract:
The majority of the world's population now resides in urban environments, and information on the internal composition and dynamics of these environments is essential to enable preservation of certain standards of living. Remotely sensed data, especially the global coverage of moderate spatial resolution satellites such as Landsat, the Indian Resource Satellite and the Système Pour l'Observation de la Terre (SPOT), offer a highly useful data source for mapping the composition of these cities and examining their changes over time. The utility and range of applications for remotely sensed data in urban environments could be improved with a more appropriate conceptual model relating urban environments to the sampling resolutions of imaging sensors and processing routines. Hence, the aim of this work was to take the Vegetation-Impervious surface-Soil (VIS) model of urban composition and match it with the most appropriate image processing methodology to deliver information on VIS composition for urban environments. Several approaches were evaluated for mapping the urban composition of Brisbane city (south-east Queensland, Australia) using Landsat 5 Thematic Mapper data and 1:5000 aerial photographs. The methods evaluated were: image classification; interpretation of aerial photographs; and constrained linear mixture analysis. Over 900 reference sample points on four transects were extracted from the aerial photographs and used as a basis to check the output of the classification and mixture analysis. Distinctive zonations of VIS related to urban composition were found in the per-pixel classification and the aggregated air-photo interpretation; however, significant spectral confusion also resulted between classes. In contrast, the VIS fraction images produced from the mixture analysis enabled distinctive densities of commercial, industrial and residential zones within the city to be clearly defined, based on their relative amount of vegetation cover.
The soil fraction image served as an index of areas being (re)developed. The logical match of a low (L)-resolution spectral mixture analysis approach with the moderate spatial resolution image data ensured that the processing model matched the spectrally heterogeneous nature of urban environments at the scale of Landsat Thematic Mapper data.
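A minimal sketch of constrained linear mixture analysis, assuming three hypothetical endmember spectra for vegetation, impervious surface and soil; the sum-to-one constraint is imposed softly through a heavily weighted extra equation. The spectra and the synthetic pixel are illustrative, not taken from the study.

```python
import numpy as np

# Hypothetical endmember reflectances in 6 spectral bands (illustrative only).
E = np.array([
    [0.05, 0.08, 0.04, 0.50, 0.25, 0.12],   # vegetation
    [0.20, 0.22, 0.24, 0.26, 0.28, 0.30],   # impervious surface
    [0.12, 0.16, 0.20, 0.28, 0.35, 0.40],   # soil
]).T                                         # shape: (bands, endmembers)

def unmix(pixel, E, weight=100.0):
    """Least-squares unmixing with the sum-to-one constraint enforced
    softly by an appended, heavily weighted equation."""
    A = np.vstack([E, weight * np.ones(E.shape[1])])
    b = np.append(pixel, weight)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f

# Synthetic mixed pixel: 60% vegetation, 30% impervious, 10% soil.
true_f = np.array([0.6, 0.3, 0.1])
pixel = E @ true_f
f = unmix(pixel, E)   # recovered VIS fractions for this pixel
```

In practice each image pixel is unmixed this way to produce the VIS fraction images described above, usually with an additional non-negativity constraint on the fractions.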
Abstract:
Value has been defined in different theoretical contexts as need, desire, interest, standard/criteria, beliefs, attitudes, and preferences. The creation of value is key to any business, and any business activity is about exchanging some tangible and/or intangible good or service and having its value accepted and rewarded by customers or clients, either inside the enterprise or collaborative network or outside it. "Perhaps surprising then is that firms often do not know how to define value, or how to measure it" (Anderson and Narus, 1998, cited by [1]). Woodruff echoed that we need "richer customer value theory" to provide "an important tool for locking onto the critical things that managers need to know". In addition, he emphasized that "we need customer value theory that delves deeply into customer's world of product use in their situations" [2]. In this sense, we proposed and validated a novel "Conceptual Model for Decomposing the Value for the Customer". To this end, we were aware that time has a direct impact on customer perceived value, and that suppliers' and customers' perceptions change from the pre-purchase to the post-purchase phase, causing some uncertainty and doubt. We wanted to break down value into all its components, as well as all the assets built and used (from both endogenous and exogenous perspectives). This component analysis was then transposed into a mathematical formulation using the Fuzzy Analytic Hierarchy Process (AHP), so that the uncertainty and vagueness of value perceptions could be embedded in a model that relates the assets used and built in the exchange of tangible and intangible deliverables among the parties involved to their actual value perceptions.
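As an illustration of how a fuzzy AHP can turn pairwise value judgements into crisp component weights, the sketch below applies Buckley's geometric-mean method to a hypothetical matrix of triangular fuzzy comparisons among three invented value components; it is a generic fuzzy-AHP sketch, not the paper's actual formulation.

```python
import numpy as np

# Hypothetical value components compared pairwise with triangular fuzzy
# numbers (l, m, u); the reciprocal of (l, m, u) is (1/u, 1/m, 1/l).
TFN = {
    ("price", "quality"):   (1/3, 1/2, 1),
    ("price", "service"):   (1, 2, 3),
    ("quality", "service"): (2, 3, 4),
}
items = ["price", "quality", "service"]

def fuzzy_matrix(items, tfn):
    n = len(items)
    M = np.ones((n, n, 3))                    # diagonal is (1, 1, 1)
    for (a, b), (l, m, u) in tfn.items():
        i, j = items.index(a), items.index(b)
        M[i, j] = (l, m, u)
        M[j, i] = (1 / u, 1 / m, 1 / l)       # fuzzy reciprocal
    return M

def buckley_weights(M):
    """Buckley's method: component-wise fuzzy geometric mean of each row,
    normalized by fuzzy division, then defuzzified by the centroid."""
    g = np.prod(M, axis=1) ** (1.0 / M.shape[0])   # row geometric means (l, m, u)
    total = g.sum(axis=0)
    # Fuzzy division: l_i / sum(u), m_i / sum(m), u_i / sum(l).
    w = np.stack([g[:, 0] / total[2], g[:, 1] / total[1], g[:, 2] / total[0]], axis=1)
    crisp = w.mean(axis=1)                         # centroid defuzzification
    return crisp / crisp.sum()

weights = buckley_weights(fuzzy_matrix(items, TFN))
```

The triangular spreads carry the uncertainty and vagueness of the judgements through the computation, which is the role the Fuzzy AHP plays in the model described above.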