742 results for implementations


Relevance:

10.00%

Publisher:

Abstract:

Ornamental plant production in the State of Florida is an anomaly with respect to current theories of globalization, and particularly their explanation of the employment of low-wage, immigrant labor. Those theories dictate that unskilled jobs that do not need to be performed within highly developed countries are outsourced to places where labor is cheaper and more flexible. However, the State of Florida remains an important site of ornamental plant production in the US amidst a global economic environment of outsourcing and transnational corporate expansion. This dissertation relies on 50 semi-structured interviews with insiders of the Florida plant nursery industry, focus groups, and participant observation to explain how US trade, labor, and migration policy-making at local levels is not removed from the larger global processes that have taken place in the world since the 1970s. In Florida, elite market players of the plant nursery industry have been able to resist global trends in free trade, operating instead in a protected market. They have done this by appealing to scientific justifications and through arbitrary implementations of neoliberal ideology that keep small and mid-range businesses alive while maintaining a seemingly endless supply of marginalized and exploited low-wage, immigrant workers.

Relevance:

10.00%

Publisher:

Abstract:

Support Vector Machines (SVMs) are widely used classifiers for detecting physiological patterns in Human-Computer Interaction (HCI). Their success is due to their versatility, their robustness, and the wide availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameter selection are reported, making it impossible to reproduce the study analyses and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the application of SVMs. The aim of this paper is to provide a review of the usage of SVMs in the determination of brain and muscle patterns for HCI, focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning the reviewed papers are listed in tables, and statistics on SVM use in the literature are presented. The suitability of SVMs for HCI is discussed, and critical comparisons with other classifiers are reported.
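As a minimal sketch of the kind of fully specified SVM pipeline such a review argues for (not code from the paper itself), the following assumes scikit-learn, a synthetic matrix of EEG/EMG features, and an illustrative hyperparameter grid; the point is simply that the kernel, C, gamma, feature scaling, and cross-validation scheme are all stated explicitly, which makes the analysis reproducible.

```python
# Illustrative, fully reported SVM classification pipeline for EEG/EMG
# feature vectors (assumes scikit-learn; data and grid are synthetic,
# not taken from the reviewed studies).
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))      # 200 trials, 16 band-power features (synthetic)
y = rng.integers(0, 2, size=200)    # two mental/muscular states (synthetic)

pipeline = Pipeline([
    ("scale", StandardScaler()),    # feature scaling: should be reported
    ("svm", SVC(kernel="rbf")),     # kernel choice: should be reported
])

# Explicit hyperparameter grid: reporting C and gamma makes the study reproducible.
param_grid = {"svm__C": [0.1, 1, 10, 100], "svm__gamma": [0.001, 0.01, 0.1]}
search = GridSearchCV(
    pipeline,
    param_grid,
    cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0),
    scoring="accuracy",
)
search.fit(X, y)
print("best parameters:", search.best_params_)
print("cross-validated accuracy: %.3f" % search.best_score_)
```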

Relevance:

10.00%

Publisher:

Abstract:

Image and video compression play a major role in the world today, allowing the storage and transmission of large volumes of multimedia content. However, processing this information requires high computational resources, so improving the computational performance of these compression algorithms is very important. The Multidimensional Multiscale Parser (MMP) is a pattern-matching-based compression algorithm for multimedia content, namely images, that achieves high compression ratios while maintaining good image quality (Rodrigues et al. [2008]). However, in comparison with other existing algorithms, this algorithm takes considerable time to execute. Therefore, two parallel implementations for GPUs were proposed by Ribeiro [2016] and Silva [2015], in CUDA and OpenCL-GPU, respectively. In this dissertation, to complement that work, we propose two parallel versions that run the MMP algorithm on the CPU: one resorting to OpenMP and another that converts the existing OpenCL-GPU code into OpenCL-CPU. The proposed solutions improve the computational performance of MMP by factors of 3 and 2.7, respectively. High Efficiency Video Coding (HEVC/H.265) is the most recent standard for image and video compression. Its impressive compression performance makes it a target for many adaptations, particularly for holoscopic (light field) image/video processing. Some of the proposed modifications for encoding this new multimedia content are based on geometry-based disparity compensation (SS), developed by Conti et al. [2014], and on a Geometric Transformations (GT) module, proposed by Monteiro et al. [2015]. These HEVC-based compression algorithms for holoscopic images implement a specific search for similar micro-images that is more efficient than the one performed by HEVC, but their implementation is considerably slower than HEVC. In order to achieve better execution times, we chose the OpenCL API as the GPU-enabling language to increase the module's performance. For its most costly setting, we are able to reduce the GT module's execution time from 6.9 days to less than 4 hours, effectively attaining a speedup of 45×.
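The dissertation's CPU versions use OpenMP and OpenCL-CPU; purely as a language-neutral illustration of the underlying data-parallel pattern, the sketch below distributes image blocks over a pool of CPU workers in Python. The encode_block cost function and the assumption that blocks can be processed independently are simplifications (the real MMP dictionary is adaptive), so this is not the dissertation's implementation.

```python
# Illustrative CPU-side data parallelism over image blocks, analogous in
# spirit to the OpenMP/OpenCL-CPU versions discussed above. The real MMP
# encoder is far more involved; encode_block is a placeholder cost.
from multiprocessing import Pool
import numpy as np

BLOCK = 16  # hypothetical block size

def encode_block(args):
    idx, block = args
    # Placeholder "pattern matching" cost: distance of the block to its mean.
    cost = float(np.abs(block - block.mean()).sum())
    return idx, cost

def split_blocks(image):
    h, w = image.shape
    for i in range(0, h, BLOCK):
        for j in range(0, w, BLOCK):
            yield (i, j), image[i:i + BLOCK, j:j + BLOCK]

if __name__ == "__main__":
    image = np.random.default_rng(0).integers(0, 256, size=(512, 512))
    with Pool() as pool:                      # one worker per CPU core
        results = pool.map(encode_block, split_blocks(image))
    print("encoded", len(results), "blocks")
```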

Relevance:

10.00%

Publisher:

Abstract:

Undergraduate project submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the Licenciatura degree in Physiotherapy.

Relevance:

10.00%

Publisher:

Abstract:

Abstract: After developing many sensor networks using custom protocols to save energy and minimise code complexity, we have now experimented with standards-based designs. These use IPv6 (6LoWPAN), RPL routing, CoAP for interfaces and data access, and protocol buffers for data encapsulation. Deployments in the Cairngorm mountains have shown the capabilities and limitations of the implementations. This seminar will outline the hardware and software we used and discuss the advantages of the more standards-based approach. At the same time, we have been progressing with high-quality imaging of cultural heritage using the RTIdomes, so some results and designs will be shown as well. This seminar will therefore cover everything from peat bogs to museums, and from binary, HTTP-like REST to 3,500-year-old documents written on clay.
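As an illustration of the CoAP-based data access mentioned above (not the seminar's own code), the following minimal sketch issues a CoAP GET to a sensor node using the aiocoap Python library; the node address and the /sensors/temperature resource path are hypothetical.

```python
# Minimal CoAP GET against a 6LoWPAN sensor node (illustrative; the node
# address and the /sensors/temperature resource path are hypothetical).
import asyncio
from aiocoap import Context, Message, GET

async def read_sensor():
    protocol = await Context.create_client_context()
    request = Message(code=GET, uri="coap://[2001:db8::1]/sensors/temperature")
    response = await protocol.request(request).response
    # In the deployments described above the payload would typically be a
    # protocol-buffers message; here we just print the raw bytes.
    print(response.code, response.payload)

asyncio.run(read_sensor())
```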

Relevance:

10.00%

Publisher:

Abstract:

Knowledge organization in the networked environment is guided by standards. Standards in knowledge organization are built on principles. For example, NISO Z39.19-1993 Guide to the Construction of Monolingual Thesauri (now undergoing revision) and NISO Z39.85-2001 Dublin Core Metadata Element Set are two standards used in many implementations. Both of these standards were crafted with knowledge organization principles in mind. Therefore it is standards work guided by knowledge organization principles which can affect the design of information services and technologies. This poster outlines five threads of thought that inform knowledge organization principles in the networked environment. An understanding of each of these five threads informs system evaluation. The evaluation of knowledge organization systems should be tightly linked to a rigorous understanding of the principles of construction. Thus some foundational evaluation questions grow from an understanding of standards and principles: on what principles is this knowledge organization system built? How well does this implementation meet the ideal conceptualization of those principles? How does this tool compare to others built on the same principles?

Relevance:

10.00%

Publisher:

Abstract:

The following project will be carried out with the aim of presenting an improvement proposal to the company iCufiño Creative Solutions S.A.S for the management and development of the company's internal processes, such as Design, Production, Administration, and Management. The organization is dedicated to providing advertising solutions for its clients, and its main objective is the delivery of quality projects that exceed its users' expectations.

Relevance:

10.00%

Publisher:

Abstract:

This research sought to incorporate the intensive use of ICT into the established procedures required in the processes of generation, distribution, and control of energy, which is expressed in a manual for the human resources management system of the organization analysed. The research began with a survey of the state of the art, continued with an analysis of the attitudes and aptitudes of the employees, based on theoretical proposals and existing best practices in the field, and concluded with a human resources management manual indicating, for the organization's different profiles, the competencies required for the use of ICT and its application, with the purpose of aligning with the perspectives and objectives of the organization analysed, grounded in its long-term sustainability and competitiveness.

Relevance:

10.00%

Publisher:

Abstract:

Streamflow is considered a driver of inter- and intra-specific life-history differences among freshwater fish. Therefore, dams and the related flow regulation can have deleterious impacts on their life cycles. The main objective of this study is to assess the effects of flow regulation on the growth and reproduction of a non-migratory fish species. During one year, samples were collected from two populations of Iberian chub inhabiting rivers with non-regulated and regulated flow regimes. Flow regulation for water derivation promoted changes in the chub's condition, in the duration of gonad maturation and spawning, in fecundity, and in oocyte size. However, this non-migratory species was less responsive to streamflow regulation than a previously analysed migratory species. Findings from this study are important to understand the changes imposed by regulated rivers on fish and can be used as guidelines for the implementation of flow requirements. RESUMO: Streamflow is one of the factors responsible for the functioning of the life cycles of freshwater fish species. Dams, and the associated flow regulation, can have impacts on the life cycles of these species. The objective of this study is to assess the effects of flow regulation on the growth and reproduction of a non-migratory fish species. The analysis of samples collected from populations of the Northern Iberian chub (escalo do Norte) from two rivers, one with regulated and one with non-regulated flow, identified significant impacts on body condition, gonad maturation and spawning, fecundity, and oocyte size. This non-migratory species appears to be less responsive to flow artificialization than a previously analysed migratory species. These results help to understand the changes imposed by flow regulation and can be used in river rehabilitation programmes.

Relevance:

10.00%

Publisher:

Abstract:

Since their emergence, locally resonant metamaterials have found several applications for the control of surface waves, from micrometer-sized electronic devices to meter-sized seismic barriers. The interaction between Rayleigh-type surface waves and resonant metamaterials has been investigated through the realization of locally resonant metasurfaces: thin elastic interfaces constituted by a cluster of resonant inclusions or oscillators embedded near the surface of an elastic waveguide. When such resonant metasurfaces are embedded in an elastic homogeneous half-space, they can filter out the propagation of Rayleigh waves, creating low-frequency bandgaps at selected frequencies. In the civil engineering context, heavy resonating masses are needed to extend the bandgap frequency width of locally resonant devices, a requirement that limits their practical implementations. In this dissertation, the wave attenuation capabilities of locally resonant metasurfaces are enriched by (i) proposing tunable metasurfaces that open large frequency bandgaps with small effective inertia, and (ii) developing an analytical framework for studying the propagation of Rayleigh waves in deep resonant waveguides. In more detail, inertially amplified resonators are exploited to design advanced metasurfaces with a prescribed static response and a tunable dynamic response. The modular design of the tunable metasurfaces makes it possible to shift and enlarge low-frequency spectral bandgaps without modifying the total inertia of the metasurface. Besides, an original dispersion law is derived to study the dispersive properties of Rayleigh waves propagating in thick resonant layers made of sub-wavelength resonators. Accordingly, a deep resonant wave barrier of mechanical resonators embedded in the soil is designed to impede the propagation of seismic surface waves. Numerical models are developed to confirm the analytical dispersion predictions for the tunable metasurface and the resonant layer. Finally, a medium-scale resonant wave barrier is designed according to the soil stratigraphy of a real geophysical scenario to attenuate ground-borne vibration.
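As a minimal illustration of why heavy resonating masses matter (a textbook single-resonator picture, not the dissertation's own dispersion law), the relations below give the tuning frequency of a spring-mass resonator of mass m and stiffness k attached to a host mass M, and the frequency-dependent effective mass of that unit cell; the locally resonant bandgap corresponds to the interval where the effective mass is negative, and widening it requires a larger mass ratio m/M.

```latex
% Spring-mass resonator (mass m, stiffness k) attached to a host mass M.
% Resonator tuning frequency:
\[
  \omega_r = \sqrt{\frac{k}{m}}, \qquad f_r = \frac{\omega_r}{2\pi}.
\]
% Frequency-dependent effective mass of the unit cell:
\[
  m_{\mathrm{eff}}(\omega) = M + \frac{m\,\omega_r^{2}}{\omega_r^{2}-\omega^{2}},
  \qquad
  m_{\mathrm{eff}}(\omega) < 0
  \;\text{ for }\;
  \omega_r < \omega < \omega_r\sqrt{1 + \tfrac{m}{M}}.
\]
% The interval of negative effective mass is the locally resonant bandgap;
% its width grows with the mass ratio m/M, hence the need for heavy
% resonating masses noted above.
```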

Relevance:

10.00%

Publisher:

Abstract:

Big data are reshaping the way we interact with technology, thus fostering new applications to increase the safety assessment of foods. An extraordinary amount of information is analysed using machine learning approaches aimed at detecting the existence, or predicting the likelihood, of future risks. Food business operators have to share the results of these analyses when applying to place regulated products on the market, whereas agri-food safety agencies (including the European Food Safety Authority) are exploring new avenues to increase the accuracy of their evaluations by processing Big data. Such an informational endowment brings with it opportunities and risks correlated with the extraction of meaningful inferences from data. However, conflicting interests and tensions among the involved entities - the industry, food safety agencies, and consumers - hinder the finding of shared methods to steer the processing of Big data in a sound, transparent and trustworthy way. A recent reform of the EU sectoral legislation, the lack of trust, and the presence of a considerable number of stakeholders highlight the need for ethical contributions aimed at steering the development and deployment of Big data applications. Moreover, the Artificial Intelligence guidelines and charters published by European Union institutions and Member States have to be discussed in light of applied contexts, including the one at stake here. This thesis aims to contribute to these goals by discussing which principles should be put forward when processing Big data in the context of agri-food safety risk assessment. The research focuses on two topics - data ownership and data governance - by evaluating how the regulatory framework addresses the challenges raised by Big data analysis in these domains. The outcome of the project is a tentative Roadmap aimed at identifying the principles to be observed when processing Big data in this domain and their possible implementations.

Relevance:

10.00%

Publisher:

Abstract:

This dissertation addresses the main legal issues arising from the opening of insolvency proceedings, examining the questions of greatest legal and operational relevance for the maritime transport sector under the two supranational systems governing cross-border insolvency, i.e. the one inspired by the UNCITRAL Model Law and Regulation (EU) 848/2015. The UNCITRAL and EU regulatory frameworks therefore provided the starting point for scrutinizing the possible areas of conflict between maritime transport and insolvency proceedings: numerous areas of potential collision emerged, above all in relation to the connecting factors typical of navigation (first of all, the flag as the distinguishing element of a ship's nationality) and, consequently, to the identification of the centre of main interests of the debtor/shipowner, especially when, as frequently happens in practice at the international level, it is organized in the form of a shipping group. The second chapter is devoted, broadly speaking, to maritime security interests and their relationship with insolvency proceedings, with particular reference to ship mortgages and maritime liens. In this regard, the main issues related to the enforcement of such security interests are analysed, notably in relation to the arrest of ships under the 1952 Brussels Convention in the context of cross-border insolvency. The third and final chapter is devoted to limitation of liability, as an institution typical of the sector, from the perspective of the possible interference between the constitution of the funds provided for by the LLMC and CLC Conventions and any insolvency proceedings. The research has shown that the universality to which Regulation 848/2015 (formerly 1346/2000) and the UNCITRAL system aspire is undermined by the coexistence of a multiplicity of different interpretations and implementations, such that the cross-border insolvency of shipping companies is not regulated uniformly, with the resulting possibility that analogous cases and situations are treated differently.

Relevance:

10.00%

Publisher:

Abstract:

The thesis describes the overall historical-literary, archaeological, and digital background required to produce a digital atlas of ancient Greece, based on the collection and analysis of the data and information contained in Pausanias' Periegesis. Thanks to the use of GIS applications, and in particular ArcGIS Online, it was possible to create a georeferenced database containing the information and descriptions provided by the text; each identification of a historic site was also compared with the current state of archaeological research, in order to produce an innovative tool both for historical-archaeological research and for the study and evaluation of Pausanias' work. Specifically, the work is a first example of a digital atlas based entirely on the interpretation of a classical text through the georeferencing of its contents. For each identified site, the relevant passage of Pausanias is specified, directly linking the archaeological datum with the literary source. To define an effective taxonomy for analysing the contents of the work, the elements described by Pausanias were associated with seven layers within the map, corresponding to as many general categories (cities, extra-urban sanctuaries, monuments, sacred groves, localities, watercourses, and mountains). For each element, further information was then entered in a descriptive table: source, identification, the period to which it belongs, and the status of the identification.
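As a sketch of how a single georeferenced entry of such an atlas could be encoded (illustrative only; the coordinates, the Pausanias passage, and the attribute values are examples, not records taken from the thesis database), the snippet below stores one site on one of the seven layers together with the descriptive-table fields listed above.

```python
# Illustrative GeoJSON-style record for one atlas entry: a site placed on
# one of the seven layers, with the descriptive table fields listed above.
# Coordinates, the Pausanias passage, and attribute values are examples.
import json

feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [23.08, 37.60]},  # lon, lat (approximate)
    "properties": {
        "layer": "extra-urban sanctuaries",          # one of the seven map layers
        "source": "Pausanias, Periegesis 2.27.1",    # passage linked to the site
        "identification": "Sanctuary of Asklepios at Epidauros",
        "period": "Classical-Hellenistic",
        "identification_status": "certain",
    },
}

print(json.dumps(feature, indent=2, ensure_ascii=False))
```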

Relevance:

10.00%

Publisher:

Abstract:

The continuous and swift progression of both wireless and wired communication technologies in today's world owes its success to the foundational systems established earlier. These systems serve as the building blocks that enable the enhancement of services to cater to evolving requirements. Studying the vulnerabilities of previously designed systems and their current usage leads to the development of new communication technologies that replace the old ones, such as GSM-R in the railway field. Current industrial research has a specific focus on finding an appropriate telecommunication solution for railway communications to replace the GSM-R standard, which will be switched off in the coming years. Various standardization organizations are currently exploring and designing a radio-frequency-based standard solution to serve railway communications, in the form of the FRMCS (Future Railway Mobile Communication System), as a substitute for the current GSM-R. On this topic, the primary strategic objective of the research is to assess the feasibility of leveraging current public network technologies, such as LTE, to cater to mission- and safety-critical communication for low-density lines. The research aims to identify the constraints, define a service level agreement with telecom operators, and establish the implementations necessary to make the system as reliable as possible over an open and public network, while considering safety and cybersecurity aspects. The LTE infrastructure would be used to transmit the vital data for the communication of a railway system and to gather and transmit all the field measurements to the control room for maintenance purposes. Given the significance of maintenance activities in the railway sector, the ongoing research includes the implementation of a machine learning algorithm to detect railway equipment faults, reducing analysis time and human error caused by the large volume of measurements from the field.
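As a minimal sketch of the fault-detection step mentioned above (the research's actual algorithm and features are not specified here), the snippet below runs an unsupervised IsolationForest over synthetic field measurements as one plausible stand-in for flagging anomalous equipment readings for maintenance review.

```python
# Illustrative anomaly-based fault detection on synthetic field measurements
# (e.g., voltage, current, temperature readings sent over the LTE link).
# IsolationForest is one plausible, unsupervised stand-in; the actual
# algorithm and features used in the research are not shown here.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=[230.0, 5.0, 35.0], scale=[2.0, 0.3, 1.5], size=(1000, 3))
faulty = rng.normal(loc=[180.0, 9.0, 70.0], scale=[5.0, 1.0, 5.0], size=(10, 3))
measurements = np.vstack([normal, faulty])

detector = IsolationForest(contamination=0.01, random_state=0)
labels = detector.fit_predict(measurements)   # -1 = anomaly, 1 = normal

flagged = np.where(labels == -1)[0]
print(f"{len(flagged)} measurements flagged for maintenance review")
```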

Relevance:

10.00%

Publisher:

Abstract:

The pervasive availability of connected devices in every industrial and societal sector is pushing for an evolution of the well-established cloud computing model. The emerging paradigm of the cloud continuum embraces this decentralization trend and envisions virtualized computing resources physically located between traditional datacenters and data sources. By executing totally or partially closer to the network edge, applications can react more quickly to events, thus enabling advanced forms of automation and intelligence. However, these applications also induce new data-intensive workloads with low-latency constraints that require the adoption of specialized resources, such as high-performance communication options (e.g., RDMA, DPDK, XDP). Unfortunately, cloud providers still struggle to integrate these options into their infrastructures. That risks undermining the principle of generality that underlies the economy of scale of cloud computing, by forcing developers to tailor their code to low-level APIs, non-standard programming models, and static execution environments. This thesis proposes a novel system architecture to empower cloud platforms across the whole cloud continuum with Network Acceleration as a Service (NAaaS). To provide commodity yet efficient access to acceleration, this architecture defines a layer of agnostic high-performance I/O APIs, exposed to applications and clearly separated from the heterogeneous protocols, interfaces, and hardware devices that implement it. A novel system component embodies this decoupling by offering a set of agnostic OS features to applications: memory management for zero-copy transfers, asynchronous I/O processing, and efficient packet scheduling. This thesis also explores the design space of possible implementations of this architecture by proposing two reference middleware systems and by adopting them to support interactive use cases in the cloud continuum: a serverless platform and an Industry 4.0 scenario. A detailed discussion and a thorough performance evaluation demonstrate that the proposed architecture is suitable for enabling the easy-to-use, flexible integration of modern network acceleration into next-generation cloud platforms.
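As a sketch of the decoupling idea only (all names are illustrative and this is not the thesis's NAaaS implementation), the snippet below defines an acceleration-agnostic asynchronous I/O interface that applications program against, together with a single commodity backend built on plain asyncio sockets; in the proposed architecture, accelerated backends such as RDMA, DPDK, or XDP datapaths would implement the same interface, leaving the application code unchanged.

```python
# Illustrative acceleration-agnostic async I/O API: applications code against
# NetChannel, while backends implement it (here a plain asyncio socket; in a
# real system, RDMA/DPDK/XDP datapaths). All names are hypothetical.
import asyncio
from abc import ABC, abstractmethod

class NetChannel(ABC):
    """Backend-agnostic I/O interface exposed to applications."""

    @abstractmethod
    async def send(self, payload: bytes) -> None: ...

    @abstractmethod
    async def recv(self, max_bytes: int = 4096) -> bytes: ...

class TcpChannel(NetChannel):
    """Commodity backend built on asyncio streams (no acceleration)."""

    def __init__(self, reader: asyncio.StreamReader, writer: asyncio.StreamWriter):
        self._reader, self._writer = reader, writer

    @classmethod
    async def connect(cls, host: str, port: int) -> "TcpChannel":
        reader, writer = await asyncio.open_connection(host, port)
        return cls(reader, writer)

    async def send(self, payload: bytes) -> None:
        self._writer.write(payload)
        await self._writer.drain()

    async def recv(self, max_bytes: int = 4096) -> bytes:
        return await self._reader.read(max_bytes)

async def main():
    # An application written against NetChannel would run unchanged if the
    # platform substituted an accelerated backend for TcpChannel.
    channel = await TcpChannel.connect("example.org", 80)
    await channel.send(b"HEAD / HTTP/1.0\r\nHost: example.org\r\n\r\n")
    lines = (await channel.recv()).decode(errors="replace").splitlines()
    print(lines[0] if lines else "(no response)")

asyncio.run(main())
```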