979 results for Psychophysics continuum
Abstract:
In this thesis, we explore three methods for the geometrico-static modelling of continuum parallel robots. Inspired by biological trunks, tentacles and snakes, continuum robot designs can reach confined spaces, manipulate objects in complex environments and conform to curvilinear paths in space. In addition, parallel continuum manipulators have the potential to inherit some of the compactness and compliance of continuum robots while retaining some of the precision, stability and strength of rigid-link parallel robots. The foundation of our work is the modelling of slender beams using the Cosserat rod theory, which is appropriate for continuum robots. Three different approaches are then developed on a case study of a planar parallel continuum robot composed of two connected flexible links. We solve the forward and inverse geometrico-static problems by using (a) shooting methods to obtain a numerical solution, (b) an elliptic method to find a quasi-analytical solution, and (c) the Corde model to perform further model analysis. The performance of each of the studied methods is evaluated and their limits are highlighted. This thesis is organized as follows. Chapter one gives an introduction to the field of continuum robotics and introduces the parallel continuum robot studied in this work. Chapter two describes the geometrico-static problem and gives its mathematical formulation. Chapter three explains the numerical approach based on the shooting method, and chapter four introduces the quasi-analytical solution. Chapter five then introduces the analytic method inspired by the Corde model, and chapter six gives the conclusions of this work.
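To illustrate the shooting approach mentioned above, the sketch below solves the geometrico-static problem of a single planar Cosserat (Kirchhoff) rod clamped at its base and loaded by a distributed weight and a tip force. It is a minimal, hedged example, not the thesis' two-link robot: the length, stiffness and load values are assumptions chosen only for illustration, and the unknown internal loads at the clamped end are the shooting variables driven by the tip boundary conditions.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import fsolve

# Planar, inextensible Cosserat (Kirchhoff) rod of length L and bending
# stiffness EI, clamped at s = 0, under a distributed load w and a tip force.
L, EI = 0.5, 1.0e-2              # assumed values, for illustration only
w = np.array([0.0, -0.5])        # assumed distributed load per unit length [N/m]
F_tip = np.array([0.0, -1.0])    # assumed external force applied at the tip [N]

def rod_ode(s, y):
    # y = [x, yq, theta, nx, ny, m]: position, orientation, internal loads
    x, yq, th, nx, ny, m = y
    return [np.cos(th), np.sin(th), m / EI,
            -w[0], -w[1],                        # force balance: n' + w = 0
            nx * np.sin(th) - ny * np.cos(th)]   # moment balance

def residual(guess):
    # guess = internal force and moment at the clamped end (the "shot")
    nx0, ny0, m0 = guess
    y0 = [0.0, 0.0, 0.0, nx0, ny0, m0]
    sol = solve_ivp(rod_ode, (0.0, L), y0, rtol=1e-9)
    xL, yL, thL, nxL, nyL, mL = sol.y[:, -1]
    # Tip boundary conditions: internal force equals the tip force, no tip moment
    return [nxL - F_tip[0], nyL - F_tip[1], mL]

root = fsolve(residual, np.array([0.0, -1.0, 0.1]))   # initial guess for the unknowns
print("Base internal loads (nx, ny, m):", root)
```

Once the base loads are known, the same integration yields the full rod shape; for the parallel two-link robot studied in the thesis, additional unknowns and geometric closure conditions at the rigid coupling would enter the residual.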
Abstract:
The application of modern ICT technologies is radically changing many fields, pushing toward more open and dynamic value chains and fostering the cooperation and integration of many connected partners, sensors, and devices. A valuable example is the emerging Smart Tourism field, which derives from the application of ICT to Tourism so as to create richer, more integrated experiences and make them more accessible and sustainable. From a technological viewpoint, a recurring challenge in these decentralized environments is the integration of heterogeneous services and data spanning multiple administrative domains, each possibly applying different security/privacy policies, device and process control mechanisms, service access and provisioning schemes, etc. The distribution and heterogeneity of those sources exacerbate the complexity of developing integration solutions, with consequently high effort and costs for the partners seeking them. Taking a step towards addressing these issues, we propose APERTO, a decentralized and distributed architecture that aims at facilitating the blending of data and services. At its core, APERTO relies on APERTO FaaS, a Serverless platform allowing fast prototyping of the business logic, lowering the barrier of entry and the development costs for newcomers, fine-grained (scale-to-zero) scaling of the resources serving end-users, and reduced management overhead. The APERTO FaaS infrastructure is based on asynchronous and transparent communications between the components of the architecture, allowing the development of optimized solutions that exploit the peculiarities of distributed and heterogeneous environments. In particular, APERTO addresses the provisioning of scalable and cost-efficient mechanisms targeting: i) function composition, allowing the definition of complex workloads from simple, ready-to-use functions and enabling smarter management of complex tasks and improved multiplexing capabilities; ii) the creation of end-to-end differentiated QoS slices minimizing interference among applications/services running on a shared infrastructure; iii) an abstraction providing uniform and optimized access to heterogeneous data sources; and iv) a decentralized approach for the verification of access rights to resources.
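As a toy illustration of the function-composition idea in point i), the sketch below chains simple, ready-to-use functions into one workload. The compose helper and the stage names (resize, classify, notify) are hypothetical and are not part of APERTO.

```python
from functools import reduce

def compose(*stages):
    """Chain single-argument functions left to right into one callable workload."""
    return lambda payload: reduce(lambda acc, f: f(acc), stages, payload)

# Hypothetical ready-to-use functions representing simple processing steps.
def resize(img):  return {"img": img, "size": "256x256"}
def classify(d):  return {**d, "label": "tourist-landmark"}
def notify(d):    print("result:", d["label"]); return d

pipeline = compose(resize, classify, notify)   # one complex workload from simple functions
pipeline("photo.jpg")
```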
Abstract:
The recent trend of moving Cloud Computing capabilities to the edge of the network is reshaping how applications and their middleware supports are designed, deployed, and operated. This new model envisions a continuum of virtual resources between the traditional cloud and the network edge, which is potentially more suitable to meet the heterogeneous Quality of Service (QoS) requirements of diverse application domains and next-generation applications. Several classes of advanced Internet of Things (IoT) applications, e.g., in the industrial manufacturing domain, exhibit a wide range of heterogeneous QoS requirements and call for QoS management systems to guarantee/control performance indicators, even in the presence of real-world factors such as limited bandwidth and concurrent virtual resource utilization. The present dissertation proposes a comprehensive QoS-aware architecture that addresses the challenges of integrating cloud infrastructure with edge nodes in IoT applications. The architecture provides end-to-end QoS support by incorporating several components for managing physical and virtual resources. The proposed architecture features: i) a multilevel middleware for resolving the convergence between Operational Technology (OT) and Information Technology (IT); ii) an end-to-end QoS management approach compliant with the Time-Sensitive Networking (TSN) standard; iii) new approaches for virtualized network environments, such as running TSN-based applications under Ultra-low Latency (ULL) constraints in virtual and 5G environments; and iv) an accelerated and deterministic container overlay network architecture. Additionally, the QoS-aware architecture includes two novel middleware components: i) a middleware that transparently integrates multiple acceleration technologies in heterogeneous Edge contexts and ii) a QoS-aware middleware for Serverless platforms that leverages the coordination of various QoS mechanisms and a virtualized Function-as-a-Service (FaaS) invocation stack to manage end-to-end QoS metrics. Finally, all architecture components were tested and evaluated on realistic testbeds, demonstrating the efficacy of the proposed solutions.
Abstract:
The pervasive availability of connected devices in any industrial and societal sector is pushing for an evolution of the well-established cloud computing model. The emerging paradigm of the cloud continuum embraces this decentralization trend and envisions virtualized computing resources physically located between traditional datacenters and data sources. By totally or partially executing closer to the network edge, applications can react more quickly to events, thus enabling advanced forms of automation and intelligence. However, these applications also induce new data-intensive workloads with low-latency constraints that require the adoption of specialized resources, such as high-performance communication options (e.g., RDMA, DPDK, XDP, etc.). Unfortunately, cloud providers still struggle to integrate these options into their infrastructures. This risks undermining the principle of generality that underlies the cloud computing economy of scale, by forcing developers to tailor their code to low-level APIs, non-standard programming models, and static execution environments. This thesis proposes a novel system architecture to empower cloud platforms across the whole cloud continuum with Network Acceleration as a Service (NAaaS). To provide commodity yet efficient access to acceleration, this architecture defines a layer of agnostic high-performance I/O APIs, exposed to applications and clearly separated from the heterogeneous protocols, interfaces, and hardware devices that implement it. A novel system component embodies this decoupling by offering a set of agnostic OS features to applications: memory management for zero-copy transfers, asynchronous I/O processing, and efficient packet scheduling. This thesis also explores the design space of possible implementations of this architecture by proposing two reference middleware systems and by adopting them to support interactive use cases in the cloud continuum: a serverless platform and an Industry 4.0 scenario. A detailed discussion and a thorough performance evaluation demonstrate that the proposed architecture is suitable to enable the easy-to-use, flexible integration of modern network acceleration into next-generation cloud platforms.
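To make the idea of an acceleration-agnostic I/O layer concrete, the sketch below shows one possible shape of such an API in Python. The class and method names are hypothetical and are not the APIs proposed in the thesis: applications code against the abstract interface, while backends (a plain kernel TCP socket here; RDMA or DPDK in an accelerated deployment) are swapped underneath without touching application logic.

```python
import abc, asyncio

class AcceleratedSocket(abc.ABC):
    """Agnostic asynchronous I/O interface exposed to applications."""
    @abc.abstractmethod
    async def send(self, buf: memoryview) -> None: ...
    @abc.abstractmethod
    async def recv(self, nbytes: int) -> memoryview: ...

class KernelTcpSocket(AcceleratedSocket):
    """Fallback backend built on ordinary asyncio streams."""
    def __init__(self, reader, writer):
        self._reader, self._writer = reader, writer
    async def send(self, buf):
        self._writer.write(bytes(buf)); await self._writer.drain()
    async def recv(self, nbytes):
        return memoryview(await self._reader.readexactly(nbytes))

async def echo_once(sock: AcceleratedSocket):
    # Application code sees only the agnostic API, regardless of the backend.
    await sock.send(memoryview(b"ping"))
    return await sock.recv(4)

async def main():
    # Loopback demo: a trivial echo server plus a client using the agnostic API.
    async def handle(r, w):
        w.write(await r.readexactly(4)); await w.drain(); w.close()
    server = await asyncio.start_server(handle, "127.0.0.1", 8899)
    reader, writer = await asyncio.open_connection("127.0.0.1", 8899)
    print(bytes(await echo_once(KernelTcpSocket(reader, writer))))
    writer.close(); server.close()

asyncio.run(main())
```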
Abstract:
Continuum parallel robots (CPRs) are manipulators employing multiple flexible beams arranged in parallel and connected to a rigid end-effector. CPRs promise higher payload and accuracy than serial continuum robots while keeping great flexibility. Since the risk of injury during accidental contact between a human and a CPR is reduced, CPRs may be used in large-scale collaborative tasks or in assisted robotic surgery. There exist various CPR designs, but the conception of prototypes is rarely based on performance considerations, and the realization of CPRs is mainly based on intuition or on rigid-link parallel manipulator architectures. This thesis focuses on the performance analysis of CPRs and on the tools needed for such an evaluation, such as workspace computation algorithms. In particular, workspace computation strategies for CPRs are essential for performance assessment, since the CPR workspace may be used as a performance index or serve as an input to optimal-design tools. Two new workspace computation algorithms are proposed in this manuscript: the former focuses on the computation of the workspace volume and the certification of its numerical results, while the latter aims at computing the workspace boundary only. Due to the elastic nature of CPRs, a key performance indicator for these robots is the stability of their equilibrium configurations. This thesis proposes an experimental validation of the equilibrium stability assessment on a real prototype, demonstrating the limitations of some commonly used assumptions. Additionally, a novel performance index measuring the distance to instability is proposed in this manuscript. Differently from the majority of existing approaches, the clear advantage of the proposed index is its sound physical meaning; accordingly, the index can be used for a more straightforward performance quantification and to derive robot specifications.
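For intuition about what a workspace volume computation involves, the sketch below estimates a planar workspace area by sampling actuation inputs on a grid and counting the cells reached by the end-effector. This is a naive, uncertified procedure, not the algorithms of the thesis, and the forward_model used here is a hypothetical stand-in for a CPR geometrico-static model (which would return None when no stable equilibrium exists).

```python
import numpy as np

def forward_model(q1, q2):
    # Hypothetical planar two-input model, used purely for illustration;
    # a real CPR model would solve the geometrico-static equations instead.
    x = 0.3 * np.cos(q1) + 0.3 * np.cos(q1 + q2)
    y = 0.3 * np.sin(q1) + 0.3 * np.sin(q1 + q2)
    return (x, y)

cell = 0.02                                   # grid cell size in metres
reached = set()
for q1 in np.linspace(-np.pi, np.pi, 200):
    for q2 in np.linspace(-np.pi, np.pi, 200):
        p = forward_model(q1, q2)
        if p is not None:                     # keep only feasible equilibria
            reached.add((round(p[0] / cell), round(p[1] / cell)))

print("Estimated workspace area:", len(reached) * cell**2, "m^2")
```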
Abstract:
Recent technological advancements have played a key role in seamlessly integrating cloud, edge, and Internet of Things (IoT) technologies, giving rise to the Cloud-to-Thing Continuum paradigm. This cloud model connects many heterogeneous resources that generate large amounts of data and collaborate to deliver next-generation services. While it has the potential to reshape several application domains, the number of connected entities remarkably broadens the security attack surface. One of the main problems is the lack of security measures able to adapt to the dynamic and evolving conditions of the Cloud-to-Thing Continuum. To address this challenge, this dissertation proposes novel adaptable security mechanisms. Adaptable security is the capability of security controls, systems, and protocols to dynamically adjust to changing conditions and scenarios. However, since the design and development of novel security mechanisms can be explored from different perspectives and levels, we focus our attention on threat modeling and access control. The contributions of the thesis can be summarized as follows. First, we introduce a model-based methodology that secures the design of edge and cyber-physical systems. This solution identifies threats, security controls, and moving target defense techniques based on system features. Then, we focus on access control management. Since access control policies are subject to modification, we evaluate how they can be efficiently shared among distributed areas, highlighting the effectiveness of distributed ledger technologies. Furthermore, we propose a risk-based authorization middleware that adjusts permissions based on real-time data, and a federated learning framework that enhances trustworthiness by weighting each client's contribution according to the quality of its partial model. Finally, since authorization revocation is another critical concern, we present an efficient revocation scheme for verifiable credentials in IoT networks that is decentralized and requires minimal storage and computing capabilities. All the mechanisms have been evaluated under different conditions, proving their adaptability to the Cloud-to-Thing Continuum landscape.
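The sketch below shows, in a heavily simplified form, what quality-weighted aggregation in federated learning can look like: each client's model update is weighted by a quality score (e.g., the validation accuracy of its partial model) before averaging. It is a generic illustration under assumed numbers, not the dissertation's framework.

```python
import numpy as np

def aggregate(client_weights, quality_scores):
    """client_weights: list of 1-D parameter vectors; quality_scores: one float per client."""
    q = np.asarray(quality_scores, dtype=float)
    q = q / q.sum()                      # normalize quality scores into aggregation weights
    stacked = np.stack(client_weights)   # shape: (n_clients, n_params)
    return (q[:, None] * stacked).sum(axis=0)

global_model = aggregate(
    [np.array([0.9, 1.1]), np.array([5.0, 5.0])],  # second update is an outlier / low quality
    [0.92, 0.15],                                   # hypothetical quality scores
)
print(global_model)   # aggregation is dominated by the high-quality client
```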
Abstract:
The morphological criteria for the identification of intercalated duct lesions (IDLs) of salivary glands have been defined recently. It has been hypothesised that IDL could be a precursor of basal cell adenoma (BCA). BCAs show a variety of histological patterns, and the tubular variant is the one that presents the strongest resemblance to IDLs. The aim of this study was to analyse the morphological and immunohistochemical profiles of IDLs and of BCAs classified into tubular and non-tubular subtypes, to determine whether or not IDL and tubular BCA represent distinct entities. Eight IDLs, nine tubular BCAs and 19 non-tubular BCAs were studied. All tubular BCAs contained IDL-like areas, which represented 20-70% of the tumour. In non-tubular BCAs, IDL-like areas were occasional and small (<5%). One patient presented IDLs, tubular BCAs and combined IDL/tubular BCA lesions. Luminal ductal cells of IDLs and tubular BCAs exhibited positivity for CK7, lysozyme, S100 and DOG1. In the non-tubular BCA group, few luminal cells exhibited such an immunoprofile; they were mainly CK14-positive. Basal/myoepithelial cells of IDLs, tubular BCAs and non-tubular BCAs were positive for CK14, calponin, α-SMA and p63; they were more numerous in BCA lesions. IDL, tubular BCA and non-tubular BCA form a continuum of lesions in which IDLs are related closely to tubular BCA. In both, the immunoprofile of the luminal and myoepithelial cells recapitulates the normal intercalated duct. The difference between the adenoma-like subset of IDLs and tubular BCA rests mainly on the larger numbers of myoepithelial cells in the latter. Our findings indicate that at least some BCAs can arise via IDLs.
Abstract:
Thermodynamic equilibrium is a state defined by conditions that depend upon certain characteristics of the system. It requires thermal, mechanical, chemical and phase equilibrium. Continuum thermodynamics, its radical restriction usually called the thermodynamics of homogeneous processes, and the classical thermodynamics of reversible processes each define equilibrium in a different way, but these definitions lead to the same physical content.
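For reference, the textbook conditions behind the thermal, mechanical and chemical/phase equilibrium requirements mentioned above can be written as follows for two subsystems or phases $\alpha$ and $\beta$; these standard relations are added here for clarity and are not quoted from the paper.

```latex
T^{\alpha} = T^{\beta}, \qquad
p^{\alpha} = p^{\beta}, \qquad
\mu_i^{\alpha} = \mu_i^{\beta} \quad \text{for every component } i .
```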
Abstract:
Considering exclusively the intrinsic characteristics of the system, both statistical and information-theoretic interpretations of the second law are used to provide more comprehensive meanings for the concepts of entropy, temperature, and the Helmholtz and Gibbs energies. The coherence of the Clausius inequality with these concepts is emphasized. The aim of this work is to re-discuss the second law of thermodynamics in accordance with the thermodynamics of homogeneous processes, a temporal science which is a very special oversimplification of continuum mechanics for spatially constant intensive properties.
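The standard relations behind the quantities discussed in this abstract are, for the record, the Clausius inequality over a cycle, the statistical (Gibbs/Shannon) entropy, and the Helmholtz and Gibbs energies; they are textbook definitions rather than material reproduced from the paper.

```latex
\oint \frac{\delta Q}{T} \le 0, \qquad
S = -k_B \sum_i p_i \ln p_i, \qquad
A = U - TS, \qquad
G = H - TS = U + pV - TS .
```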
Abstract:
The purpose of this study was to conduct an exploratory factor analysis of Problems in School, an instrument for evaluating teachers' motivational styles constructed by Deci et al. The original instrument is in a Likert-scale format, with the underlying assumption of the existence of a continuum of four different styles contributing to the promotion of students' autonomy. Translated into Portuguese, the instrument was applied to 582 elementary and junior high school teachers from several regions of Brazil. Factor analyses revealed a solution with four distinct orthogonal factors; the authors' initial supposition (the existence of a continuum) was not confirmed. In fact, only two opposite styles (high promotion of autonomy and high promotion of control) corresponded to the original ideas of Deci et al. Problems regarding the validity of the remaining styles emerged. The data were discussed and a revised version of the scale was developed. Directions for further research are also suggested.
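A minimal sketch of extracting four orthogonal factors from Likert-type item responses is shown below; the item responses are synthetic and the number of items is hypothetical, so this only illustrates the kind of exploratory factor analysis the study describes, not its actual data or results.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic stand-in for 582 teachers answering 32 hypothetical Likert items (1-7).
rng = np.random.default_rng(0)
responses = rng.integers(1, 8, size=(582, 32)).astype(float)

fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0)
scores = fa.fit_transform(responses)    # per-respondent factor scores
loadings = fa.components_.T             # item-by-factor loading matrix
print(loadings.shape)                    # (32, 4): inspect which items load on which factor
```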
Abstract:
OBJECTIVE: To develop a method and a device for quantifying vision in candelas (cd). Studies that measure vision are important for all the visual sciences. METHODS: This is a theoretical and experimental study. The details of the psychophysical method and of the calibration of the device are described. Preliminary tests were performed on volunteers. RESULTS: The test is a simple psychophysical test whose result is expressed in units of the International System of Units. With the technical description provided, it will be possible to reproduce the experiment in other research centers. CONCLUSION: Results measured in luminous intensity (cd) are an option for visual assessment. These results will make it possible to extrapolate measurements to mathematical models and to simulate individual effects with aberrometric data.
Abstract:
The purpose of this study was to analyse the effect of aging on the perceptual and motor aspects involved in the actions of sitting down on and rising from a chair. Young and elderly individuals were filmed while sitting down on/rising from a chair at seven different seat heights. They judged the difficulty/ease they experienced in sitting down and rising at each seat height. The elderly participants exhibited changes in the control strategy used to sit down at the lowest seat height and overestimated the level of difficulty/ease of performing the sitting and rising tasks. In summary, the elderly participants' perception that the sitting task was easy to execute did not agree with the degree of difficulty exhibited in their motor performance at the lowest seat height.
Abstract:
Building on recent proposals by several authors, this work seeks to integrate perspectives on learning that have been conceived as mutually exclusive. This reflection is justified by the importance of not introducing a phylogenetic discontinuity into a process conceived as adaptive but which is also cultural. Accordingly, proposals concerning the coevolution of the human mind and culture that would support such a perspective are examined, and an integrated view of learning is proposed as a set of processes organized along an implicit-explicit continuum.
Abstract:
We present Psychophysics as a science applied to clinical investigation, approaches and diagnosis. We first introduce some of the epistemological and theoretical aspects of the field, then move on to the approaches that Psychophysics can offer in clinical practice and, finally, discuss recent advances in its clinical application, present the experience of our clinical psychophysics research laboratory, and conclude with the prospects of extending the use of psychophysics to the clinical investigation of more complex perceptual functions.
Abstract:
Color is a perceptual attribute that allows us to identify and localize environmental patterns of the same brightness, and it constitutes an additional dimension in the identification of objects, beyond the detection of numerous other object attributes in their relation to the visual scene, such as luminance, contrast, form, movement, texture and depth. Hence its fundamental importance in the activities performed by animals and by human beings in their interaction with the environment. Visual psychophysics is concerned with the quantitative study of the relationship between physical events of sensory stimulation and the behavioral response resulting from this stimulation, thereby providing a means of evaluating aspects of human vision, such as color vision. This article aims to present several efficient techniques for the evaluation of human chromatic vision through adaptive psychophysical methods.
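As an example of the family of adaptive methods the article refers to, the sketch below implements a classic 2-down/1-up staircase run against a simulated observer. It is a generic textbook procedure, not a specific technique from the article; the observer model, its true threshold and all numbers are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
true_threshold = 0.30                 # simulated (hypothetical) chromatic contrast threshold

def observer_detects(level):
    # Simulated observer: logistic psychometric function around the threshold.
    p = 1.0 / (1.0 + np.exp(-(level - true_threshold) / 0.05))
    return rng.random() < p

level, step, hits, reversals, last_dir = 0.8, 0.05, 0, [], 0
while len(reversals) < 8:             # stop after 8 direction reversals
    if observer_detects(level):
        hits += 1
        if hits == 2:                 # two consecutive detections -> make the stimulus harder
            hits = 0
            if last_dir == +1: reversals.append(level)
            level, last_dir = max(level - step, 0.0), -1
    else:                             # a miss -> make the stimulus easier
        hits = 0
        if last_dir == -1: reversals.append(level)
        level, last_dir = level + step, +1

# The 2-down/1-up rule converges near the ~70.7%-correct point of the psychometric function.
print("Threshold estimate:", np.mean(reversals))
```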