967 results for Lock-In
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
A novel design based on electric-field-free open microwell arrays for the automated continuous-flow sorting of single cells or small clusters of cells is presented. The main feature of the proposed device is the parallel analysis of cell-cell and cell-particle interactions in each microwell of the array. High-throughput sample recovery, with fast and separate transfer from the microsites to standard microtiter plates, is also possible thanks to flexible printed circuit board technology, which makes it possible to produce cost-effective, large-area arrays with geometries compatible with laboratory equipment. Particle isolation is performed via negative dielectrophoretic forces, which convey the particles into the microwells. Particles such as cells and beads flow in electrically active microchannels on whose substrate the electrodes are patterned. The introduction of particles into the microwells is performed automatically, with the required feedback signal generated by a microscope-based optical counting and detection routine. In order to isolate a controlled number of particles, two configurations of the electric field within the structure were created: the first permits particle isolation, whereas the second creates a net force that repels particles from the microwell entrance. To increase the parallelism of the cell-isolation function, a new technique based on coplanar electrodes for detecting particle presence was introduced. A lock-in amplifying scheme was used to monitor the impedance of the channel as it is perturbed by flowing particles in high-conductivity suspension media. The impedance measurement module was also combined with a dielectrophoretic focusing stage situated upstream of the measurement stage, to limit the dispersion of the measured signal amplitude caused by variation of particle position within the microchannel. In conclusion, the designed system complies with the initial specifications, making it suitable for cellomics and biotechnology applications.
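For readers unfamiliar with the technique, the sketch below illustrates the kind of lock-in detection this abstract describes: the channel signal is mixed with quadrature references at the excitation frequency and low-pass filtered, so that a particle transit appears as a transient change in the recovered amplitude. It is a minimal sketch; the carrier frequency, filter, noise level and transit model are illustrative assumptions, not details from the cited work.

```python
import numpy as np

def lock_in(signal, fs, f_ref, tau=0.01):
    """Digital lock-in demodulation: mix with quadrature references at
    f_ref, then low-pass filter to recover amplitude and phase."""
    t = np.arange(len(signal)) / fs
    x = signal * np.cos(2 * np.pi * f_ref * t)   # in-phase mixing
    y = signal * np.sin(2 * np.pi * f_ref * t)   # quadrature mixing
    n = max(1, int(tau * fs))                    # moving-average low-pass
    kernel = np.ones(n) / n
    X = np.convolve(x, kernel, mode="same")
    Y = np.convolve(y, kernel, mode="same")
    r = 2.0 * np.hypot(X, Y)                     # amplitude at f_ref
    phi = np.arctan2(Y, X)                       # phase vs. reference
    return r, phi

# Hypothetical example: a 10 kHz carrier whose amplitude dips briefly as a
# particle perturbs the channel impedance, plus additive measurement noise.
fs, f_ref = 1_000_000, 10_000
t = np.arange(0, 0.1, 1 / fs)
transit = 1.0 - 0.05 * np.exp(-((t - 0.05) / 1e-3) ** 2)
sig = transit * np.cos(2 * np.pi * f_ref * t) + 0.1 * np.random.randn(t.size)
amplitude, phase = lock_in(sig, fs, f_ref)       # dip visible near t = 50 ms
```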
Abstract:
The promising development of routine nanofabrication and the increasing knowledge of the working principles of new classes of highly sensitive, label-free and potentially cost-effective bio-nanosensors for the detection of molecules in liquid environments have rapidly increased the possibility of developing portable sensor devices that could have a great impact on many application fields, such as health care, the environment and food production, thanks to the intrinsic ability of these biosensors to detect, monitor and study events at the nanoscale. Moreover, there is a growing demand for low-cost, compact readout structures able to perform accurate preliminary tests on biosensors and/or routine tests under experimental conditions without requiring skilled personnel and bulky laboratory instruments. This thesis focuses on analysing, designing and testing novel implementations of bio-nanosensors in layered hybrid systems in which microfluidic devices and microelectronic systems are fused in compact printed circuit board (PCB) technology. In particular, the manuscript presents hybrid systems in two validating cases, using nanopore and nanowire technology, demonstrating new features not covered by state-of-the-art technologies and based on the use of two custom integrated circuits (ICs). As far as the nanopore interface system is concerned, an automatic setup has been developed for the concurrent formation of bilayer lipid membranes, combined with a custom parallel readout electronic system, creating a complete portable platform for nanopore or ion channel studies. Regarding the nanowire readout hybrid interface, two systems have been developed that enable parallel, real-time complex impedance measurements based on the lock-in technique, as well as impedance spectroscopy measurements. This feature makes it possible to experimentally investigate whether the information obtained from the bio-nanosensors can be enriched by concurrently acquiring impedance magnitude and phase, thus probing capacitive contributions of bioanalytical interactions at the biosensor surface.
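As an illustration of the complex impedance readout mentioned above: a lock-in measurement of both the sensor voltage and current yields in-phase and quadrature components that can be treated as phasors and divided; repeating this at each excitation frequency yields the magnitude/phase spectrum. The sketch below is a minimal, self-contained illustration with hypothetical readings, not the thesis's actual IC readout chain.

```python
import numpy as np

def complex_impedance(vx, vy, ix, iy):
    """Combine in-phase (x) and quadrature (y) lock-in outputs for the
    sensor voltage and current into impedance magnitude and phase."""
    V = vx + 1j * vy                  # voltage phasor
    I = ix + 1j * iy                  # current phasor
    Z = V / I
    return abs(Z), np.degrees(np.angle(Z))

# Hypothetical phasor readings at a single excitation frequency:
mag, phase = complex_impedance(vx=0.70, vy=-0.10, ix=6.5e-7, iy=2.0e-7)
# Sweeping the excitation frequency and repeating this calculation at each
# point produces an impedance spectrum (magnitude and phase vs. frequency).
```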
Abstract:
A microfluidic Organ-on-Chip has been developed for monitoring an epithelial cell monolayer. An equivalent circuit model was used to determine the electrical properties of the monolayer from its impedance spectra. Platinum black was electrochemically deposited onto the surface of the platinum electrodes to reduce the influence of the electrical double layer on the impedance measurements. Impedance measurements with an impedance analyzer were performed to validate the equivalent circuit model and the reduction of the double-layer effect. A lock-in amplifier was designed to measure the impedance.
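A common equivalent circuit for such measurements places the two electrode double-layer capacitances in series with the solution resistance and a parallel resistance/capacitance element for the monolayer; platinizing the electrodes enlarges their effective area, raising the double-layer capacitance and shrinking its contribution to the spectrum. The sketch below illustrates this logic with purely illustrative parameter values; the abstract does not report the fitted circuit or its values.

```python
import numpy as np

def monolayer_impedance(f, R_sol=2e3, R_teer=5e4, C_cell=1e-8, C_dl=1e-6):
    """Illustrative equivalent circuit: two double-layer capacitances
    (C_dl) in series with the solution resistance (R_sol) and a parallel
    R_teer/C_cell element representing the epithelial monolayer."""
    w = 2 * np.pi * np.asarray(f, dtype=float)
    Z_dl = 2.0 / (1j * w * C_dl)                       # both interfaces
    Z_cell = R_teer / (1 + 1j * w * R_teer * C_cell)   # monolayer element
    return Z_dl + R_sol + Z_cell

freqs = np.logspace(1, 5, 40)                          # 10 Hz to 100 kHz
Z_bare = monolayer_impedance(freqs, C_dl=1e-7)         # bare platinum
Z_black = monolayer_impedance(freqs, C_dl=1e-5)        # platinum black
# With the larger C_dl, |Z| at low frequency approaches the cell/solution
# contribution alone, i.e. the double-layer distortion is strongly reduced.
```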
Abstract:
Since the UsedSoft ruling of the CJEU in 2012, there has been the distinct feeling that - like the big bang - UsedSoft signals the start of a new beginning. As we enter this brave new world, the Copyright Directive will be read anew: misalignments in the treatment of physical and digital content will be resolved; accessibility and affordability for consumers will be heightened; and lock-in will be reduced as e-exhaustion takes hold. With UsedSoft as a precedent, the Court can do nothing but keep expanding its own ruling. For big bang theorists, it is only a matter of time until the digital first sale meteor strikes non-software downloads also. This paper looks at whether the UsedSoft ruling could indeed be the beginning of a wider doctrine of e-exhaustion, or if it is simply a one-shot comet restrained by provisions of the Computer Program Directive on which it was based. Fighting the latter corner, we have the strict word of the law; in the UsedSoft ruling, the Court appears to willingly bypass the international legal framework of the WCT. As far as expansion goes, the Copyright Directive was conceived specifically to implement the WCT, thus the legislative intent is clear. The Court would not, surely, invoke its modicum of creativity there also... With perhaps undue haste in a digital market of many unknowns, it seems this might well be the case. Provoking the big bang theory of e-exhaustion, the UsedSoft ruling can be read as distinctly purposive, but rather than having copyright norms in mind, the standard for the Court is the same free movement rules that underpin the exhaustion doctrine in the physical world. With an endowed sense of principled equivalence, the Court clearly wishes the tangible and intangible rules to be aligned. Against the backdrop of the European internal market, perhaps few legislative instruments would staunchly stand in its way. With firm objectives in mind, the UsedSoft ruling could be a rather disruptive meteor indeed.
Abstract:
Accurate estimation of the firn lock-in depth is essential for correctly linking gas and ice chronologies in ice core studies. Here, two approaches to constraining the firn depth evolution in Antarctica over the last deglaciation are presented: outputs of a firn densification model, and measurements of δ15N of N2 in air trapped in ice cores, assuming that δ15N is affected only by gravitational fractionation in the firn column. Since the firn densification process is largely governed by surface temperature and accumulation rate, we have investigated four ice cores drilled in coastal (Berkner Island, BI, and James Ross Island, JRI) and semi-coastal (TALDICE and EPICA Dronning Maud Land, EDML) Antarctic regions. Combined with available ice core air-δ15N measurements from the EPICA Dome C (EDC) site, the studied regions encompass a large range of surface accumulation rates and temperature conditions. Our δ15N profiles reveal a heterogeneous response of the firn structure to glacial–interglacial climatic changes. While firn densification simulations correctly predict TALDICE δ15N variations, they systematically fail to capture the large millennial-scale δ15N variations measured at BI and the glacial δ15N levels measured at JRI and EDML – a mismatch previously reported for central East Antarctic ice cores. New constraints on the EDML gas–ice depth offset during the Laschamp event (~41 ka) and the last deglaciation do not favour the hypothesis of a large convective zone within the firn as the explanation of the glacial firn model–δ15N data mismatch for this site. While we could not conduct an in-depth study of the influence of impurities in snow on firnification from the existing datasets, our detailed comparison between the δ15N profiles and firn model simulations under different temperature and accumulation rate scenarios suggests that the role of accumulation rate may have been underestimated in the current description of firnification models.
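For context, the gravitational fractionation assumption invoked above corresponds to the barometric equation, under which δ15N grows with the height z of the diffusive firn column: δ15N = (exp(Δm·g·z / (R·T)) − 1) × 1000‰, with Δm = 0.001 kg/mol the 15N14N–14N14N mass difference. Inverting it gives the diffusive column height from a measured δ15N. The sketch below assumes purely gravitational fractionation (no thermal or convective effects), as the abstract does; the example numbers are illustrative, not values from the cited cores.

```python
import numpy as np

def diffusive_column_height(d15N_permil, T=230.0):
    """Invert the barometric equation
    d15N = (exp(dm * g * z / (R * T)) - 1) * 1000
    for the diffusive column height z, assuming purely gravitational
    fractionation in the firn column."""
    R = 8.314        # gas constant, J mol^-1 K^-1
    g = 9.81         # gravitational acceleration, m s^-2
    dm = 1.0e-3      # 15N14N - 14N14N mass difference, kg mol^-1
    return np.log(d15N_permil / 1000.0 + 1.0) * R * T / (dm * g)

# Illustrative: 0.45 permil at a firn temperature of 230 K gives z of
# roughly 88 m, a lower bound on the lock-in depth if a convective or
# non-diffusive zone is also present.
z = diffusive_column_height(0.45)
```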
Abstract:
If we postulate a need for the transformation of society towards sustainable development, we also need to transform science and overcome the fact/value split that makes it impossible for science to be accountable to society. The orientation of this paradigm transformation in science has been under debate for four decades, generating important theoretical concepts, but these have had limited impact to date. This is due to a contradictory normative science policy framing that science has difficulty dealing with, not least because the dominant framing creates a lock-in. We postulate that, in addition to introducing transdisciplinarity, science needs to strive for integration of the normative aspect of sustainable development at the meta-level. This requires a strategically managed niche within which scholars and practitioners from many different disciplines can engage in a long-term common learning process, in order to become a “thought collective” (Fleck) capable of initiating the paradigm transformation. Arguing with Piaget that “decentration” is essential to achieve normative orientation and coherence in a learning collective, we introduce a learning approach - Cohn's “Theme-Centred Interaction” - which provides a methodology for explicitly working with the objectivity and subjectivity of statements and positions in a “real-world” context, and for consciously integrating concerns of individuals in their interdependence with the world. This should enable a thought collective to address the epistemological and ethical barriers to science for sustainable development.
Abstract:
In his contribution, Joppke justifies his selection of foundational scholars by linking each to what he sees as the three key facets of citizenship: status, rights and identity. Maarten Vink explicitly links his research agenda to the first, status, and outlines why it is so important. In identifying three facets of citizenship, Joppke acknowledges that some academics would include political participation, but he ultimately decides against it. But here we can, and should, broaden citizenship studies by bringing in insights from the behavioral politics tradition in domestic politics - when and why people engage in political acts - and from the social movements literature in sociology. I believe that the American debate on immigration reform, admittedly stalled, would not have advanced as far as it has without the social movement activism of DREAMers - unauthorized young people pushing for a path to citizenship - and the belief that Barack Obama won re-election in part because of the Latino vote. Importantly, one type of political activism demands formal citizenship, the other does not. As many contributors note, the “national models” approach has had a significant impact on citizenship studies. Whether one views such models through a cultural, institutional or historical lens, this tends to be a top-down, macro-level framework. What about immigrants’ agency? In Canada, although the ruling Conservative government is shifting citizenship discourse to a more traditional language - as Winter points out - it has not reduced immigration, ended dual citizenship, or eliminated multiculturalism, all goals of the Reform Party that the current prime minister once helped build. “Lock-in” effects (or policy feedback loops) based on high immigrant naturalization and the coming of age of a second generation with citizenship also demand study, in North America and elsewhere. Much of the research thus far suggests that political decisions over citizenship status and rights are not linked to immigrants’ political activism. State-centered decision-making may have characterized policy in the early post-World War II period in Europe (and East Asia?), but does it continue to hold today? Majority publics and immigrant-origin residents are increasingly politicized around citizenship and immigration. Does immigrant agency extend citizenship status, rights and identity to those born outside the polity? Is electoral power key, or is protest necessary? How is citizenship practiced, and contested, irrespective of formal status? These are important and understudied empirical questions, ones that demand theoretical creativity - across sub-fields and disciplines - in conceptualizing and understanding citizenship in contemporary times.
Abstract:
Abstract of Bazin et al. (2013): An accurate and coherent chronological framework is essential for the interpretation of climatic and environmental records obtained from deep polar ice cores. Until now, one common ice core age scale had been developed based on an inverse dating method (Datice), combining glaciological modelling with absolute and stratigraphic markers between four ice cores covering the last 50 ka (thousands of years before present) (Lemieux-Dudon et al., 2010). In this paper, together with the companion paper of Veres et al. (2013), we present an extension of this work back to 800 ka for the NGRIP, TALDICE, EDML, Vostok and EDC ice cores using an improved version of the Datice tool. The AICC2012 (Antarctic Ice Core Chronology 2012) chronology includes numerous new gas and ice stratigraphic links as well as improved evaluation of background and associated variance scenarios. This paper concentrates on the long timescales between 120 and 800 ka. In this framework, new measurements of δ18Oatm over Marine Isotope Stage (MIS) 11-12 on EDC and a complete δ18Oatm record of the TALDICE ice core permit us to derive additional orbital gas age constraints. The coherency of the different orbitally deduced ages (from δ18Oatm, δO2/N2 and air content) has been verified before implementation in AICC2012. The new chronology is now independent of other archives and shows only small differences, most of the time within the original uncertainty range calculated by Datice, when compared with the previous ice core reference age scale EDC3, the Dome F chronology, or with a comparison between speleothem and methane records. For instance, the largest deviation between AICC2012 and EDC3 (5.4 ka) is obtained around MIS 12. Despite significant modifications of the chronological constraints around MIS 5, now independent of speleothem records in AICC2012, the date of Termination II is very close to the EDC3 one.
Abstract of Veres et al. (2013): The deep polar ice cores provide reference records commonly employed in the global correlation of past climate events. However, temporal divergences reaching up to several thousand years (ka) exist between ice cores over the last climatic cycle. In this context, we here introduce the Antarctic Ice Core Chronology 2012 (AICC2012), a new and coherent timescale developed for four Antarctic ice cores, namely Vostok, EPICA Dome C (EDC), EPICA Dronning Maud Land (EDML) and Talos Dome (TALDICE), alongside the Greenlandic NGRIP record. The AICC2012 timescale has been constructed using the Bayesian tool Datice (Lemieux-Dudon et al., 2010), which combines glaciological inputs and data constraints, including a wide range of relative and absolute gas and ice stratigraphic markers. We focus here on the last 120 ka, whereas the companion paper by Bazin et al. (2013) focuses on the interval 120-800 ka. Compared to previous timescales, AICC2012 presents an improved timing for the last glacial inception, respecting the glaciological constraints of all analyzed records. Moreover, with the addition of numerous new stratigraphic markers and an improved calculation of the lock-in depth (LID) based on δ15N data employed as the Datice background scenario, AICC2012 presents a slightly improved timing for the bipolar sequence of events over Marine Isotope Stage 3 associated with the seesaw mechanism, with maximum differences of about 600 yr with respect to the previous Datice-derived chronology of Lemieux-Dudon et al. (2010), hereafter denoted LD2010. Our improved scenario confirms the regional differences in millennial-scale variability over the last glacial period: while the EDC isotopic record (events of triangular shape) displays peaks roughly at the same time as the NGRIP abrupt isotopic increases, the EDML isotopic record (events characterized by broader peaks or even extended periods of high isotope values) reached its isotopic maximum several centuries earlier. It is expected that the future contribution of other long ice core records and of other types of chronological constraints to the Datice tool will lead to further refinements in the ice core chronologies beyond AICC2012. For the time being, however, we recommend that AICC2012 be used as the preferred chronology for the Vostok, EDC, EDML and TALDICE ice core records, both over the last glacial cycle (this study) and beyond (following Bazin et al., 2013). The ages for NGRIP in AICC2012 are virtually identical to those of GICC05 for the last 60.2 ka, whereas the ages beyond are independent of those in GICC05modelext (as in the construction of AICC2012, GICC05modelext was included only via the background scenarios and not as age markers). As such, where issues of phasing between Antarctic records included in AICC2012 and NGRIP are involved, the NGRIP ages in AICC2012 should be used, to avoid introducing false offsets. However, for issues involving only Greenland ice cores, there is not yet a strong basis for superseding GICC05modelext as the recommended age scale for Greenland ice cores.
Abstract:
This paper proposes a general equilibrium model of a monocentric city based on Fujita and Krugman (1995). Two rates of transport cost per unit distance for the same good are introduced. The model assumes that the lower transport costs are available only at a few points on a line; these lower costs represent new transport facilities, such as high-speed motorways and railways. The main finding is that new transport facilities connecting the city and its hinterlands strengthen the lock-in effect, i.e. the tendency of a city to remain where it is forever after being created. Furthermore, the effect intensifies with better agricultural technologies and a larger population in the economy. The relationship between indirect utility and population size has an inverted U-shape even when new transport facilities are used. However, the population size that maximizes indirect utility is smaller than that found in Fujita and Krugman (1995).