903 results for Data-driven knowledge acquisition
Abstract:
This paper introduces an area- and power-efficient approach for compressive recording of cortical signals in an implantable system prior to transmission. Recent research on compressive sensing has shown promising results for sub-Nyquist sampling of sparse biological signals. Still, any large-scale implementation of this technique faces critical issues caused by the increased hardware intensity: the area cost of implementing compressive sensing in a multichannel system can be significantly higher than that of a conventional data acquisition system without compression. To tackle this issue, a new multichannel compressive sensing scheme is proposed that exploits the spatial sparsity of the signals recorded from the electrodes of the sensor array. The analysis shows that with this method, power efficiency is preserved to a great extent while the area overhead is significantly reduced, resulting in an improved power-area product. The proposed circuit architecture is implemented in a UMC 0.18 µm CMOS technology. Extensive performance analysis and design optimization have been carried out, resulting in a low-noise, compact and power-efficient implementation. The results of simulations and subsequent reconstructions show that fourfold-compressed intracranial EEG signals can be recovered with an SNR as high as 21.8 dB, while consuming 10.5 µW of power within an effective area of 250 µm × 250 µm per channel.
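The sub-Nyquist workflow the abstract describes (random linear measurements of a sparse signal, then reconstruction) can be sketched in a few lines. This is a generic compressive-sensing toy, not the paper's circuit: the signal model, Bernoulli sensing matrix and the orthogonal-matching-pursuit recovery below are illustrative assumptions, with fourfold compression chosen to mirror the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 128, 32, 4          # signal length, measurements (4x compression), sparsity

# Synthetic K-sparse "neural" signal
x = np.zeros(N)
idx = rng.choice(N, K, replace=False)
x[idx] = rng.normal(0.0, 1.0, K)

# Random +/-1 Bernoulli sensing matrix (hardware-friendly in mixed-signal designs)
Phi = rng.choice([-1.0, 1.0], size=(M, N)) / np.sqrt(M)
y = Phi @ x                   # compressed measurements, 4x fewer samples than x

# Orthogonal Matching Pursuit: greedily pick the column most correlated
# with the residual, then re-fit the coefficients on the selected support
residual, support = y.copy(), []
for _ in range(K):
    support.append(int(np.argmax(np.abs(Phi.T @ residual))))
    coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    residual = y - Phi[:, support] @ coef
x_hat = np.zeros(N)
x_hat[support] = coef

# Reconstruction quality, as in the abstract's SNR figure of merit
snr_db = 10.0 * np.log10(np.sum(x**2) / np.sum((x - x_hat)**2))
```

For an exactly sparse signal this recovers the support almost surely at these dimensions; real iEEG is only approximately sparse in a transform basis, which is why the reported SNR (21.8 dB) is finite.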
Abstract:
Despite increased scientific interest in the relatively new phenomenon of large-scale land acquisition (LSLA), data on local-level processes remain sparse and superficial. However, knowledge about the concrete implementation of LSLA projects and their different impacts on the heterogeneous group of project-affected people is indispensable for a deeper understanding of the phenomenon. To address this research gap, a team of two anthropologists and a human geographer conducted in-depth fieldwork on the LSLA project of the Swiss-based Addax Bioenergy in Sierra Leone. After the devastating civil war, the Sierra Leonean government created favourable conditions for foreign investors willing to lease large areas of land and to bring "development" to the country. As one of the numerous investing companies, Addax Bioenergy has leased 57,000 hectares of land to develop a sugarcane plantation and an ethanol factory producing biofuel for export to the European market. Based on participant observation, qualitative interview techniques and a network analysis, the research team aimed a) to identify the different actors necessary for the implementation of this project on a vertical level and b) to explore the various impacts of the project in the local context of two villages on a horizontal level. The network analysis reveals a complex pattern of companies, institutions, non-governmental organisations and prominent personalities acting within a shifting technological and discursive framework that links global scales to a unique local context. Findings from the local level indicate that affected people initially welcomed the project but are now frustrated because many promises and expectations have not been fulfilled. Although some local people are able to benefit from the project, the loss of natural resources that comes with the land lease considerably affects the livelihoods of vulnerable groups, especially women and land users.
This research not only discloses impacts on local people's previous lives but also addresses the strategies they adopt in the newly created situation, which has opened up alternative spaces for renegotiating power and legitimacy. This explorative study thereby reveals new aspects of LSLA that have been considered adequately neither by the investing company nor by the general academic discourse on LSLA.
Abstract:
Despite increased scientific interest in the phenomenon of large-scale land acquisitions (LSLA), accurate data on implementation processes remain sparse. This paper aims to fill this gap by providing empirical in-depth knowledge on the case of the Swiss-based Addax Bioenergy Ltd. in Sierra Leone. Extensive fieldwork allowed the interdisciplinary research team 1) to identify the different actors necessary for the implementation on a vertical level and 2) to document the perceptions and strategies of the heterogeneous group of project-affected people on a horizontal level. Findings reveal that even a project labeled a best-practice example by UN agencies triggers a number of problematic processes for affected communities. The loss of natural resources that comes with the land lease and the lack of employment possibilities mostly affect already vulnerable groups. On the other hand, the strategies and resistance of local people also affect the project implementation. This shows that the horizontal and vertical levels are not separate entities: they are linked by social networks, social interactions and means of communication, and both levels take part in shaping the project's impacts.
Abstract:
Increased CO2 and the associated acidification of seawater, known as ocean acidification, decrease calcification in most marine calcifying organisms. However, little information is available on how marine macroalgae respond to the chemical changes caused by seawater acidification. We hypothesized that down-regulation of bicarbonate acquisition by algae under increased acidity and CO2 levels would lower the threshold above which photosynthetically active radiation (PAR) becomes excessive. Juveniles of Ulva prolifera derived from zoospores were grown at ambient (390 µatm) and elevated (1000 µatm) CO2 concentrations for 80 days before the hypothesis was tested. The CO2-induced seawater acidification increased the quantum yield under low light but induced higher non-photochemical quenching under high light. At the same time, the PAR level at which photosynthesis saturated decreased, and the photosynthetic affinity for CO2 or inorganic carbon decreased in the plants grown at high CO2. These findings indicate that ocean acidification, as an environmental stressor, can reduce the threshold above which PAR becomes excessive.
Abstract:
Macrocystis pyrifera is a widely distributed, highly productive seaweed. It is known to use bicarbonate (HCO3-) from seawater in photosynthesis, and the main mechanism of utilization is attributed to the external catalyzed dehydration of HCO3- by the surface-bound enzyme carbonic anhydrase (CAext). Here, we examined other putative HCO3- uptake mechanisms in M. pyrifera at pHT 9.00 (HCO3-:CO2 = 940:1) and pHT 7.65 (HCO3-:CO2 = 51:1). Rates of photosynthesis, and internal CA (CAint) and CAext activity, were measured following the application of AZ, which inhibits CAext, and DIDS, which inhibits a different HCO3- uptake system via an anion exchange (AE) protein. We found that the main mechanism of HCO3- uptake by M. pyrifera is via an AE protein, regardless of the HCO3-:CO2 ratio, with CAext making little contribution. Inhibiting the AE protein led to a 55%-65% decrease in photosynthetic rates. Inhibiting both the AE protein and CAext at pHT 9.00 led to 80%-100% inhibition of photosynthesis, whereas at pHT 7.65, passive CO2 diffusion supported 33% of photosynthesis. CAint was active at pHT 7.65 and 9.00, and its activity was always higher than that of CAext, because of its role in dehydrating HCO3- to supply CO2 to RuBisCO. Interestingly, the main mechanism of HCO3- uptake in M. pyrifera differs from that in the other Laminariales studied (a CAext-catalyzed reaction), and we suggest that species-specific knowledge of carbon uptake mechanisms is required to elucidate how seaweeds might respond to future changes in HCO3-:CO2 due to ocean acidification.
Abstract:
A small Positron Emission Tomography (PET) demonstrator based on LYSO slabs and Silicon Photomultiplier (SiPM) matrices is under construction at the University and INFN of Pisa. In this paper we present the characterization results for the read-out electronics and the detection system. Two SiPM matrices, each composed of 8 × 8 SiPM pixels with 1.5 mm pitch, have been coupled one-to-one to a LYSO crystal array. Custom front-end ASICs were used to read the 64 channels of each matrix. Data from each front-end were multiplexed and sent to a DAQ board for digital conversion; a motherboard collects the data and communicates with a host computer through a USB port. Specific tests were carried out to assess the system's performance. Furthermore, we measured some of the most important parameters of the system for PET applications.
Abstract:
In parallel with the effort to create Linked Open Data for the World Wide Web, a number of projects aim to develop the same technologies for use in closed environments such as private enterprises. In this paper, we present the results of research on interlinking structured data for use in Idea Management Systems, a still rare breed of knowledge management systems dedicated to innovation management. In our study, we show the process of extending an ontology that initially covers only the Idea Management System structure towards linking with distributed enterprise data and public data using Semantic Web technologies. Furthermore, we point out how the established links can help to solve the key problems of contemporary Idea Management Systems.
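The kind of interlinking the abstract describes can be pictured as merging idea-management records and enterprise records into one triple graph and then traversing the links. The sketch below is a deliberately minimal stand-in for the Semantic Web stack: the URIs, predicates and records are invented for illustration and are not the paper's ontology.

```python
# Idea Management System records and enterprise data, expressed as
# (subject, predicate, object) triples; all identifiers are hypothetical.
ideas = [
    {"uri": "idea:42", "title": "Reduce packaging waste", "author": "emp:17"},
]
enterprise = [
    ("emp:17", "dept", "dept:logistics"),
    ("dept:logistics", "budget", 250000),
]

def to_triples(idea):
    """Lift an idea record into the shared triple vocabulary."""
    return [(idea["uri"], "title", idea["title"]),
            (idea["uri"], "submittedBy", idea["author"])]

# The interlinked graph: idea triples and enterprise triples share URIs
graph = [t for i in ideas for t in to_triples(i)] + enterprise

def query(graph, subject=None, predicate=None):
    """Match triples by optional subject/predicate, SPARQL-pattern style."""
    return [(s, p, o) for s, p, o in graph
            if (subject is None or s == subject)
            and (predicate is None or p == predicate)]

# Follow links across the formerly separate datasets:
# from an idea to its author, then to the author's department
author = query(graph, "idea:42", "submittedBy")[0][2]
dept = query(graph, author, "dept")[0][2]
```

The point of the sketch is the shared identifier (`emp:17`): once both datasets use it, queries can cross the boundary between the Idea Management System and enterprise data.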
Abstract:
The aim of this project is to design a system capable of controlling the rotation speed of a DC motor as a function of the temperature value obtained from a sensor. To this end, a microcontroller generates a PWM signal whose duty cycle depends on the measured temperature. The design phase has two clearly differentiated parts, concerning the hardware and the software. The hardware design can in turn be divided into two parts. First, circuitry had to be designed to adapt the voltage levels delivered by the temperature sensor to the levels required by the ADC, which digitizes the information for subsequent processing by the microcontroller. A circuit was therefore designed to correct the offset and slope of the sensor's voltage-temperature function, in order to fit it into the voltage range required by the ADC. Second, a circuit was designed to control the rotation speed of the motor. This circuit is based on a switching MOSFET transistor controlled by the PWM signal mentioned above: by varying the duty cycle of the PWM signal, the voltage across the motor, and therefore its rotation speed, varies proportionally. Regarding the software design, the microcontroller was programmed to generate a PWM signal on one of its pins as a function of the value delivered by the ADC, whose input is connected to the voltage produced by the sensor-conditioning circuit. The microcontroller is also used to display the measured temperature value on an LCD screen. For this project an mbed development board, which includes the integrated microcontroller, was chosen because it eases prototyping.
Both parts were then integrated, and the system was tested to verify its correct operation. Since the result depends on the measured temperature, it was necessary to simulate temperature variations in order to check the results obtained at different temperatures; a hot-air gun was used for this purpose. Once correct operation had been verified, the printed circuit board was designed as a final step. In conclusion, a system with an acceptable level of accuracy and precision, given the system's limitations, was developed. SUMMARY: It is obvious that day by day people's daily lives depend more on technology and science. Tasks tend to be done automatically, making them simpler and, as a result, making the user's life more comfortable. Every task that can be controlled has an electronic system behind it. In this project, a control system based on a microcontroller was designed for a fan, allowing it to go faster when the temperature rises and to slow down as the environment gets colder. For this purpose, a microcontroller was programmed to generate a signal controlling the rotation speed of the fan depending on the data acquired from a temperature sensor. After testing the whole design in the laboratory, the next step was to build a prototype, which allows the future improvements to the system that are discussed in the corresponding section of the thesis.
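The control path the abstract describes (ADC reading, offset/slope correction of the sensor's voltage-temperature function, then a temperature-proportional PWM duty cycle) can be sketched as follows. The sensor constants, ADC resolution and temperature endpoints below are assumptions for illustration, not values from the thesis:

```python
# Hypothetical sensor: 10 mV/degC slope with a 500 mV offset, read by a
# 12-bit ADC over a 3.3 V reference (all figures illustrative).
VREF, ADC_BITS = 3.3, 12
OFFSET_V, SLOPE_V_PER_C = 0.5, 0.010

def adc_to_celsius(raw):
    """Undo the sensor's offset and slope, the software analogue of the
    conditioning circuit described in the design."""
    volts = raw * VREF / (2**ADC_BITS - 1)
    return (volts - OFFSET_V) / SLOPE_V_PER_C

def duty_cycle(temp_c, t_min=20.0, t_max=60.0):
    """Linear map to PWM duty: fan off at/below t_min, full speed at/above
    t_max; the MOSFET then sees a proportionally varying average voltage."""
    frac = (temp_c - t_min) / (t_max - t_min)
    return min(1.0, max(0.0, frac))
```

On the mbed board the same two steps would sit in the main loop, with the duty-cycle value written to a PWM output pin and the temperature echoed to the LCD.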
Abstract:
Several activities in service-oriented computing, such as automatic composition, monitoring, and adaptation, can benefit from knowing properties of a given service composition before it is executed. Among these properties we focus on those related to execution cost and resource usage, in a wide sense, as they can be linked to QoS characteristics. To attain more accuracy, we formulate execution cost / resource usage as functions of the input data (or appropriate abstractions thereof) and show how these functions can be used to make better, more informed decisions when performing composition, adaptation, and proactive monitoring. We present an approach to, on the one hand, synthesizing these functions automatically from the definitions of the different orchestrations taking part in a system and, on the other hand, effectively using them to reduce the overall costs of non-trivial service-based systems featuring sensitivity to data and the possibility of failure. We validate our approach by means of simulations of scenarios requiring runtime selection of services and adaptation due to service failure. A number of rebinding strategies, including the use of cost functions, are compared.
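The idea of cost as a function of input data, rather than a single number, can be illustrated with a toy service-selection step. The candidate services, their cost functions and failure probabilities below are invented, and retry-until-success is only one simple failure model, not the paper's approach:

```python
# Two hypothetical candidate services for the same task. Costs depend on the
# input size n, so the cheaper choice changes with the data.
services = {
    "svcA": {"cost": lambda n: 5.0 + 0.10 * n, "p_fail": 0.01},
    "svcB": {"cost": lambda n: 1.0 + 0.25 * n, "p_fail": 0.05},
}

def expected_cost(svc, n):
    """Expected cost under retry-until-success: each attempt costs c and
    fails independently with probability p, giving c / (1 - p)."""
    return svc["cost"](n) / (1.0 - svc["p_fail"])

def select(services, n):
    """Runtime binding: pick the service with the lowest expected cost
    for this particular input size."""
    return min(services, key=lambda name: expected_cost(services[name], n))
```

For small inputs the low fixed cost of `svcB` wins; for large inputs its steeper per-item cost makes `svcA` the better binding, which is exactly the data sensitivity a scalar QoS annotation cannot capture.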
Abstract:
Expert systems are built from knowledge traditionally elicited from the human expert. It is precisely knowledge elicitation from the expert that is the bottleneck in expert system construction. On the other hand, a data mining system, which automatically extracts knowledge, needs expert guidance on the successive decisions to be made in each of the system phases. In this context, expert knowledge and data mining discovered knowledge can cooperate, maximizing their individual capabilities: data mining discovered knowledge can be used as a complementary source of knowledge for the expert system, whereas expert knowledge can be used to guide the data mining process. This article summarizes different examples of systems where there is cooperation between expert knowledge and data mining discovered knowledge and reports our experience of such cooperation gathered from a medical diagnosis project called Intelligent Interpretation of Isokinetics Data, which we developed. From that experience, a series of lessons were learned throughout project development. Some of these lessons are generally applicable and others pertain exclusively to certain project types.
Abstract:
We are investigating the performance of a data acquisition system for Time-of-Flight PET, based on LYSO crystal slabs and 64-channel Silicon Photomultiplier matrices (1.2 cm2 of active area each). Measurements have been performed to test the timing capability of the detection system (SiPM matrices coupled to a LYSO slab, plus the read-out electronics) with both a test signal and a radioactive source.
Abstract:
This paper presents a data-intensive architecture that demonstrates the ability to support applications from a wide range of domains and the different types of users involved in defining, designing and executing data-intensive processing tasks. The prototype architecture is introduced, and the pivotal role of DISPEL as a canonical language is explained. The architecture promotes the exploration and exploitation of distributed and heterogeneous data and spans the complete knowledge-discovery process, from data preparation to analysis, evaluation and reiteration. The architecture evaluation included large-scale applications from astronomy, cosmology, hydrology, functional genetics, image processing and seismology.