939 results for extraction system
Abstract:
The interference of some specific aqueous two-phase system (ATPS) phase-forming components in bovine serum albumin (BSA) determination by the Bradford method was investigated. For this purpose, calibration curves were obtained for BSA in the presence of different concentrations of salts and polymers. A total of 19 salts [Na2SO4, (NH4)2SO4, MgSO4, Li2SO4, Na2HPO4, sodium phosphate buffer (pH 7.0), NaH2PO4, K2HPO4, potassium phosphate buffer (pH 7.0), KH2PO4, C6H8O7, Na3C6H5O7, KCHO2, NaCHO2, Na2CO3, NaHCO3, C2H4O2, sodium acetate buffer (pH 4.5), and NaC2H3O2] and 7 polymers [PEG 4000, PEG 8000, PEG 20000, UCON 3900, Ficoll 70000, PES 100000, and PVP 40000] were tested, and each calibration curve was compared with the one obtained for BSA in water. Some concentrations of salts and polymers had a considerable effect on the BSA calibration curve. Carbonate salts were responsible for the highest salt interference, whereas citric and acetic acids did not produce interference even at the maximum concentration level tested (5 wt%). Among the polymers, UCON gave the highest interference, whereas Ficoll did not produce interference when used in concentrations up to 10 wt%. It was concluded that a convenient dilution of the samples prior to protein quantification is needed to ensure no significant interference from ATPS phase-forming constituents. (C) 2011 Elsevier Inc. All rights reserved.
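As an illustrative aside, comparing a calibration curve obtained in the presence of a phase-forming component against the water reference reduces to comparing fitted slopes. The sketch below assumes invented absorbance values (not the paper's data) purely to show the comparison:

```python
import numpy as np

# Hypothetical Bradford calibration data: BSA concentration (mg/mL)
# versus absorbance at 595 nm, in water and in a salt solution.
bsa = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
abs_water = np.array([0.05, 0.17, 0.29, 0.41, 0.53, 0.65])  # invented
abs_salt = np.array([0.05, 0.14, 0.23, 0.32, 0.41, 0.50])   # invented

# Fit straight calibration lines and compare their slopes
slope_water, _ = np.polyfit(bsa, abs_water, 1)
slope_salt, _ = np.polyfit(bsa, abs_salt, 1)

# Relative loss of assay sensitivity caused by the interferent
interference = 1 - slope_salt / slope_water
print(f"sensitivity loss: {interference:.0%}")
```

A dilution factor that brings the interferent below its no-effect concentration would bring the two slopes back into agreement, which is the practical recommendation of the abstract.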
Abstract:
This research reports liquid-liquid equilibrium data for the system lard (swine fat), cis-9-octadecenoic acid (oleic acid), ethanol, and water at 318.2 K, as well as their correlation with the nonrandom two-liquid (NRTL) and universal quasichemical activity coefficient (UNIQUAC) thermodynamic equations, which provided global deviations of 0.41 % and 0.53 %, respectively. Additional equilibrium experiments were also performed to obtain cholesterol partition (or distribution) coefficients, in order to assess the feasibility of using ethanol plus water to reduce the cholesterol content in lard. The partition experiments were performed with concentrations of free fatty acids (commercial oleic acid) that varied from (0 to 20) mass % and of water in the solvent that varied from (0 to 18) mass %. The percentage of free fatty acids initially present in lard had a slight effect on the distribution of cholesterol between the phases. Furthermore, the distribution coefficients decreased upon adding water to the ethanol; specifically, water reduced the capability of the solvent to remove cholesterol.
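As an illustrative aside, the "global deviation" used to judge LLE correlations such as NRTL and UNIQUAC is typically a root-mean-square deviation between experimental and calculated phase compositions. Exact definitions vary between authors, so the sketch below shows one common form, with invented numbers:

```python
import numpy as np

def global_deviation(w_exp, w_calc):
    """Root-mean-square deviation (in mass %) between experimental and
    model-calculated phase compositions. One common form of the metric;
    individual papers may normalize differently."""
    w_exp, w_calc = np.asarray(w_exp), np.asarray(w_calc)
    return 100 * np.sqrt(np.mean((w_exp - w_calc) ** 2))

# Invented mass fractions for one tie line (three components):
dev = global_deviation([0.10, 0.20, 0.70], [0.11, 0.19, 0.70])
print(f"{dev:.2f} %")
```

Deviations below about 0.5 %, like those reported above, indicate that the model reproduces the tie-line compositions essentially within experimental uncertainty.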
Abstract:
During maximal whole body exercise, VO2 peak is limited by O2 delivery. In turn, it is thought that blood flow at near-maximal exercise must be restrained by the sympathetic nervous system to maintain mean arterial pressure. To determine whether enhancing vasodilation across the leg results in higher O2 delivery and leg VO2 during near-maximal and maximal exercise in humans, seven men performed two maximal incremental exercise tests on the cycle ergometer. In random order, one test was performed with and one without (control exercise) infusion of ATP (8 mg in 1 ml of isotonic saline solution) into the right femoral artery at a rate of 80 microg.kg body mass-1.min-1. During near-maximal exercise (92% of VO2 peak), the infusion of ATP increased leg vascular conductance (+43%, P<0.05), leg blood flow (+20%, 1.7 l/min, P<0.05), and leg O2 delivery (+20%, 0.3 l/min, P<0.05). No effects were observed on leg or systemic VO2. Leg O2 fractional extraction decreased from 85+/-3% (control) to 78+/-4% (ATP) in the infused leg (P<0.05), while it remained unchanged in the left leg (84+/-2 and 83+/-2%; control and ATP; n=3). ATP infusion at maximal exercise increased leg vascular conductance by 17% (P<0.05), while leg blood flow tended to be elevated by 0.8 l/min (P=0.08). However, neither systemic nor leg peak VO2 values were enhanced, owing to a reduction of O2 extraction from 84+/-4% to 76+/-4% in the control and ATP conditions, respectively (P<0.05). In summary, the VO2 of the skeletal muscles of the lower extremities is not enhanced by limb vasodilation at near-maximal or maximal exercise in humans.
The fact that ATP infusion resulted in a reduction of O2 extraction across the exercising leg suggests a vasodilating effect of ATP on less-active muscle fibers and other noncontracting tissues, and that under normal conditions these regions are under high vasoconstrictor influence to ensure the most efficient distribution of the available cardiac output to the most active muscle fibers of the exercising limb.
Abstract:
Salt deposits characterize the subsurface of Tuzla (BiH) and have made the town famous since ancient times. Archeological discoveries demonstrate the presence of a Neolithic pile-dwelling settlement related to the existence of saltwater springs that contributed to making most of the area a swampy ground. Since Roman times the town has been reported as "the City of Salt Deposits and Springs"; "tuz" is the Turkish word for salt, as the Ottomans renamed the settlement in the 15th century following their conquest of medieval Bosnia (Donia and Fine, 1994). Natural brine springs were located everywhere, and salt has been evaporated by means of hot charcoals since pre-Roman times. The ancient use of salt was a small exploitation compared to the massive salt production carried out during the 20th century by means of classical mining methods and especially wild brine pumping. In the past salt extraction was practised by tapping natural brine springs, while the modern technique consists of about 100 boreholes with pumps tapping the natural underground brine runs at an average depth of 400-500 m. The mining operation changed the hydrogeological conditions, enabling the downward flow of fresh water and causing additional salt dissolution. This process induced severe ground subsidence over the last 60 years, reaching up to 10 meters of sinking in the most affected area. Stress and strain of the overlying rocks induced the formation of numerous fractures over a conspicuous area (3 km2). Consequently, serious damage occurred to buildings and infrastructure such as the water supply system, sewage networks and power lines. Downtown urban life was compromised by the destruction of more than 2000 buildings that collapsed or needed to be demolished, causing the resettlement of about 15000 inhabitants (Tatić, 1979).
Recently salt extraction activities have been strongly reduced, but the underground water system is returning to its natural conditions, threatening to flood the most collapsed area. Over the last 60 years the local government developed a monitoring system for the phenomenon, collecting data on geodetic measurements, the amount of brine pumped, piezometry, lithostratigraphy, the extension of the salt body and geotechnical parameters. A database was created within a scientific cooperation between the municipality of Tuzla and the city of Rotterdam (D.O.O. Mining Institute Tuzla, 2000). The scientific investigation presented in this dissertation has been financially supported by a cooperation project between the Municipality of Tuzla, the University of Bologna (CIRSA) and the Province of Ravenna. The University of Tuzla (RGGF) gave important scientific support, in particular on the geological and hydrogeological features. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are well understood in only a few areas (Gutierrez et al., 2008). The subject of this study is the collapse phenomenon occurring in the Tuzla area, with the aim of identifying and quantifying the several factors involved in the system and their correlations. The Tuzla subsidence phenomenon can be defined as a geohazard, which represents the consequence of an adverse combination of geological processes and ground conditions precipitated by human activity with the potential to cause harm (Rosenbaum and Culshaw, 2003). Where a hazard induces a risk to a vulnerable element, a risk management process is required. The single factors involved in the subsidence of Tuzla can be considered as hazards. The final objective of this dissertation is a preliminary risk assessment procedure and guidelines, developed in order to quantify the vulnerability of buildings in relation to the overall geohazard that affects the town.
The available historical database, never fully processed, has been analyzed by means of geographic information systems and mathematical interpolators (PART I). Modern geomatic applications have been implemented to investigate the most relevant hazards in depth (PART II). In order to monitor and quantify the actual subsidence rates, geodetic GPS technologies have been implemented and four survey campaigns have been carried out, one per year. The subsidence-related fracture system has been identified by means of field surveys and mathematical interpretations of the sinking surface, called curvature analysis. The comparison of mapped and predicted fractures led to a better comprehension of the problem. Results confirmed the reliability of fracture identification using curvature analysis applied to sinking data instead of topographic or seismic data. The evolution of urban changes has been reconstructed by analyzing topographic maps and satellite imagery, identifying the most damaged areas. This part of the investigation was very important for the quantification of building vulnerability.
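As an illustrative aside, curvature analysis of a gridded sinking surface reduces, in its simplest form, to finite-difference second derivatives: fractures tend to localize where the subsidence bowl bends most sharply. The sketch below is not the dissertation's actual procedure; the grid, bowl shape and threshold are all invented:

```python
import numpy as np

# Synthetic subsidence bowl on a regular grid (all values invented)
spacing = 50.0                                       # grid spacing, m
y, x = np.mgrid[-1000:1000:41j, -1000:1000:41j]      # 41 x 41 nodes
z = -10.0 * np.exp(-(x**2 + y**2) / (2 * 400.0**2))  # up to 10 m of sinking

# Second derivatives of the sinking surface by central finite differences
gy, gx = np.gradient(z, spacing)
gyy, _ = np.gradient(gy, spacing)
_, gxx = np.gradient(gx, spacing)
laplacian = gxx + gyy  # simple curvature proxy for the sinking surface

# Flag nodes where bending is strongest as candidate fracture zones
threshold = np.percentile(np.abs(laplacian), 95)
fracture_zone = np.abs(laplacian) > threshold
print(fracture_zone.sum(), "of", z.size, "nodes flagged")
```

Mapped fractures can then be overlaid on the flagged nodes to test the prediction, which is the comparison the dissertation reports.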
Abstract:
The Italian radio telescopes are currently undergoing a major upgrade in response to the growing demand for deep radio observations, such as surveys of large sky areas or observations of vast samples of compact radio sources. The optimised employment of the Italian antennas, at first constructed mainly for VLBI activities and provided with a control system (FS – Field System) not tailored to single-dish observations, required important modifications, in particular of the guiding software and data acquisition system. The production of a completely new control system called ESCS (Enhanced Single-dish Control System) for the Medicina dish started in 2007, in synergy with the software development for the forthcoming Sardinia Radio Telescope (SRT). The aim is to produce a system optimised for single-dish observations in continuum, spectrometry and polarimetry. ESCS is also planned to be installed at the Noto site. A substantial part of this thesis work consisted of designing and developing subsystems within ESCS, in order to provide this software with tools to carry out large maps, spanning from the implementation of On-The-Fly fast scans (following both conventional and innovative observing strategies) to the production of single-dish standard output files and the realisation of tools for the quick-look of the acquired data. The test period coincided with the commissioning phase for two devices temporarily installed – while waiting for the SRT to be completed – on the Medicina antenna: an 18-26 GHz 7-feed receiver and the 14-channel analogue backend developed for its use. It is worth stressing that this is at present the only K-band multi-feed receiver available worldwide. The commissioning of the overall hardware/software system constituted a considerable section of the thesis work.
Tests were conducted in order to verify the system stability and its capabilities, down to sensitivity levels which had never been reached at Medicina using the previous observing techniques and hardware devices. The aim was also to assess the scientific potential of the multi-feed receiver for the production of wide maps, exploiting its temporary availability on a mid-sized antenna. Dishes like the 32-m antennas at Medicina and Noto, in fact, offer the best conditions for large-area surveys, especially at high frequencies, as they provide a suitable compromise between beam sizes large enough to quickly cover wide areas of the sky (typical of small telescopes) and sensitivity (typical of large telescopes). The KNoWS (K-band Northern Wide Survey) project is aimed at the realisation of a full-northern-sky survey at 21 GHz; its pilot observations, performed using the new ESCS tools and a peculiar observing strategy, constituted an ideal test-bed for ESCS itself and for the multi-feed/backend system. The KNoWS group, which I am part of, supported the commissioning activities, also providing map-making and source-extraction tools, in order to complete the necessary data reduction pipeline and assess the general scientific capabilities of the system. The K-band observations, which were carried out in several sessions over the December 2008-March 2010 period, were accompanied by the realisation of a 5 GHz test survey during the summertime, which is not suitable for high-frequency observations. This activity was conceived in order to check the new analogue backend separately from the multi-feed receiver, and to simultaneously produce original scientific data (the 6-cm Medicina Survey, 6MS, a polar cap survey to complete PMN-GB6 and provide an all-sky coverage at 5 GHz).
Abstract:
The central objective of research in Information Retrieval (IR) is to discover new techniques to retrieve relevant information in order to satisfy an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept which has changed over time, from popular to personal: what was considered relevant before was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and his social context; thereby an interdisciplinary field called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (interconnected conditions that occur in an activity) and individualization (characteristics that distinguish an individual). This shift of focus to the individual's need undermines the rigid linearity of the classical model, which has been overtaken by the "berry picking" model: search terms change thanks to the informational feedback received from the search activity, introducing the concept of the evolution of search terms. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and the recent advances in IR.
The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed, because it is now expressed through the generation of the query and its own context. The method was conceived to improve the quality of search by rewriting the query based on contexts automatically generated from a local knowledge base. Furthermore, the idea of optimizing each IR system led to developing it as middleware for interaction between the user and the IR system. The system thus has just two possible actions: rewriting the query and reordering the results. These actions are equivalent to those described for PS, which generally exploits information derived from analysis of user behavior, whereas the proposed approach exploits knowledge provided by the user. The thesis went further, producing a novel assessment procedure, according to the "Cranfield paradigm", in order to evaluate this type of IR system. The results achieved are interesting considering both the effectiveness attained and the innovative approach undertaken, together with the several applications inspired by the use of a local knowledge base.
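As an illustrative aside, context-based query rewriting from a local knowledge base might look like the following minimal sketch. The knowledge-base entries and the expansion rule are invented for illustration and are not the thesis's actual algorithm:

```python
def rewrite_query(query, knowledge_base):
    """Expand a query with context terms from a local knowledge base.

    knowledge_base: dict mapping a term to related context terms,
    a stand-in for the user-provided knowledge the thesis describes.
    """
    terms = query.lower().split()
    expanded = list(terms)
    for t in terms:
        for ctx in knowledge_base.get(t, []):
            if ctx not in expanded:  # avoid duplicate expansion terms
                expanded.append(ctx)
    return " ".join(expanded)

# Hypothetical knowledge-base entry disambiguating an ambiguous term
kb = {"jaguar": ["car", "automotive"]}
print(rewrite_query("jaguar price", kb))  # → "jaguar price car automotive"
```

In a middleware setting, the rewritten query is what gets forwarded to the underlying IR system, while result reordering happens on the way back to the user.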
Abstract:
The identification of people by measuring some traits of individual anatomy or physiology has led to a specific research area called biometric recognition. This thesis is focused on improving fingerprint recognition systems with respect to three important problems: fingerprint enhancement, fingerprint orientation extraction and automatic evaluation of fingerprint algorithms. An effective extraction of salient fingerprint features depends on the quality of the input fingerprint. If the fingerprint is very noisy, we are not able to detect a reliable set of features. A new fingerprint enhancement method, which is both iterative and contextual, is proposed. This approach detects high-quality regions in fingerprints, selectively applies contextual filtering and iteratively expands like wildfire toward low-quality ones. A precise estimation of the orientation field would greatly simplify the estimation of other fingerprint features (singular points, minutiae) and improve the performance of a fingerprint recognition system. The fingerprint orientation extraction is improved along two directions. First, after the introduction of a new taxonomy of fingerprint orientation extraction methods, several variants of baseline methods are implemented and, pointing out the role of pre- and post-processing, we show how to improve the extraction. Second, the introduction of a new hybrid orientation extraction method, which follows an adaptive scheme, significantly improves orientation extraction in noisy fingerprints. Scientific papers typically propose recognition systems that integrate many modules, and therefore an automatic evaluation of fingerprint algorithms is needed to isolate the contributions that determine actual progress in the state of the art.
The lack of a publicly available framework to compare fingerprint orientation extraction algorithms motivates the introduction of a new benchmark area called FOE (including fingerprints and manually-marked orientation ground truth), along with fingerprint matching benchmarks, in the FVC-onGoing framework. The success of this framework is discussed by providing relevant statistics: more than 1450 algorithms submitted and two international competitions.
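As an illustrative aside, the "baseline methods" for orientation extraction are typically gradient-based. The sketch below shows the classical least-squares gradient estimator of block-wise ridge orientation; it is a textbook baseline, not the thesis's hybrid method, and the test image is a toy ramp:

```python
import numpy as np

def orientation_field(img, block=8):
    """Block-wise ridge orientation via the classical gradient method.

    For each block, the dominant gradient direction is found by
    averaging doubled gradient angles; the ridge orientation is
    perpendicular to it. Returns angles in radians, modulo pi.
    """
    gy, gx = np.gradient(img.astype(float))
    h, w = img.shape
    theta = np.zeros((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            sx = gx[i*block:(i+1)*block, j*block:(j+1)*block]
            sy = gy[i*block:(i+1)*block, j*block:(j+1)*block]
            num = 2 * (sx * sy).sum()
            den = (sx**2 - sy**2).sum()
            # half-angle of the averaged doubled gradients, rotated 90 deg
            theta[i, j] = 0.5 * np.arctan2(num, den) + np.pi / 2
    return theta

# Toy image: intensity ramps left-to-right, so "ridges" run vertically
img = np.tile(np.arange(8.0), (8, 1))
theta = orientation_field(img)
```

Pre-smoothing the gradients and post-smoothing the doubled-angle field, the pre/post-processing the abstract alludes to, are what make this baseline usable on noisy fingerprints.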
Abstract:
The work presented in this thesis is focused on the open-ended coaxial-probe frequency-domain reflectometry technique for complex permittivity measurement, at microwave frequencies, of dispersive dielectric multilayer materials. An effective dielectric model is introduced and validated to extend the applicability of this technique to multilayer materials in an on-line system context. In addition, the thesis presents: 1) a numerical study of imperfect contact at the probe-material interface, 2) a review of the available models and techniques, and 3) a new classification of the extraction schemes, with guidelines on how they can be used to improve the overall performance of the probe according to the problem requirements.
Abstract:
Biliary cast syndrome (BCS) is the presence of casts within the intrahepatic or extrahepatic biliary system after orthotopic liver transplantation. Our work compares two percutaneous methods for BCS treatment: the mechanical cast-extraction technique (MCE) versus the hydraulic cast-extraction (HCE) technique using a rheolytic system.
Abstract:
Conventional liquid-liquid extraction (LLE) methods require large volumes of fluids to achieve the desired mass transfer of a solute, which is unsuitable for systems dealing with a low-volume or high-value product. An alternative to these methods is to scale down the process. Millifluidic devices share many of the benefits of microfluidic systems, including low fluid volumes, increased interfacial area-to-volume ratio, and predictability. A robust millifluidic device was created from acrylic, glass, and aluminum. The channel is lined with a hydrogel cured in the bottom half of the device channel. This hydrogel stabilizes co-current laminar flow of immiscible organic and aqueous phases. Mass transfer of the solute occurs across the interface of these contacting phases. A Y-junction creates an aqueous emulsion in an organic phase. The emulsion travels through a length of tubing and then enters the co-current laminar flow device, where the emulsion is broken and each phase can be collected separately. The inclusion of this emulsion formation and separation increases the contact area between the organic and aqueous phases, thereby increasing the area over which mass transfer can occur. Using this design, 95% extraction efficiency was obtained, where 100% represents equilibrium. Continued exploration of this LLE process will allow it to be optimized and, with better understanding, more accurately modeled. This system has the potential to scale up to the industrial level and provide the efficient extraction required with low fluid volumes and a well-behaved system.
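The 95% figure above is defined relative to equilibrium. A minimal sketch of that metric follows; the concentrations are invented, chosen only so the result lands near 95%:

```python
def extraction_efficiency(c_in, c_out, c_eq):
    """Fraction of the equilibrium-limited transfer actually achieved.

    c_in:  solute concentration entering with the aqueous feed
    c_out: solute concentration leaving the aqueous feed
    c_eq:  concentration the feed would reach at equilibrium
    All concentrations in the same (arbitrary) units.
    """
    return (c_in - c_out) / (c_in - c_eq)

# Hypothetical run: feed enters at 1.0, leaves at 0.12, and
# equilibrium with the organic phase would leave it at 0.073.
eff = extraction_efficiency(1.0, 0.12, 0.073)
print(f"{eff:.0%}")
```

Defining efficiency this way makes 100% mean "the outlet stream left at equilibrium", so it measures how closely the device approaches the thermodynamic limit rather than total solute removal.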
Abstract:
In recent years, advanced metering infrastructure (AMI) has been a main research focus because the traditional power grid has become too restricted to meet development requirements. There has been an ongoing effort to increase the number of AMI devices that provide real-time data readings to improve system observability. AMI deployed across distribution secondary networks provides load and consumption information for individual households, which can improve grid management. The significant upgrade costs associated with retrofitting existing meters with network-capable sensing can be made more economical by using image processing methods to extract usage information from images of the existing meters. This thesis presents a new solution that uses online exchange of power consumption information with a cloud server without modifying the existing electromechanical analog meters. In this framework, a systematic approach to extracting energy data from images replaces the manual reading process. In a case study, the digital imaging approach is compared to averages determined by visual readings over a one-month period.
Abstract:
The electrolytic refining process, while usually considered an auxiliary process used in conjunction with pyrometallurgical extraction, deserves a special niche in the complex metallurgy of copper. The development of electrolytic copper refining, for example, is largely responsible for the prominence of the electrical industry. Conversely, it could be stated that the electrical industry played an important part in the development of the copper industry.
Abstract:
In this paper, a computer-aided diagnostic (CAD) system for the classification of hepatic lesions from computed tomography (CT) images is presented. Regions of interest (ROIs) taken from nonenhanced CT images of normal liver, hepatic cysts, hemangiomas, and hepatocellular carcinomas have been used as input to the system. The proposed system consists of two modules: the feature extraction and the classification modules. The feature extraction module calculates the average gray level and 48 texture characteristics, which are derived from the spatial gray-level co-occurrence matrices, obtained from the ROIs. The classifier module consists of three sequentially placed feed-forward neural networks (NNs). The first NN classifies into normal or pathological liver regions. The pathological liver regions are characterized by the second NN as cyst or "other disease." The third NN classifies "other disease" into hemangioma or hepatocellular carcinoma. Three feature selection techniques have been applied to each individual NN: the sequential forward selection, the sequential floating forward selection, and a genetic algorithm for feature selection. The comparative study of the above dimensionality reduction methods shows that genetic algorithms result in lower dimension feature vectors and improved classification performance.
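As an illustrative aside, the texture characteristics derived from spatial gray-level co-occurrence matrices can be sketched in pure NumPy. The toy ROI, single offset, and single Haralick statistic below stand in for the 48 features the system actually computes:

```python
import numpy as np

def glcm(roi, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    m = np.zeros((levels, levels))
    h, w = roi.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[roi[y, x], roi[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast(p):
    """Haralick contrast: sum over (i, j) of (i - j)^2 * p(i, j)."""
    i, j = np.indices(p.shape)
    return ((i - j) ** 2 * p).sum()

# Toy 4-level ROI standing in for a CT region of interest
roi = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
p = glcm(roi, levels=4)
print(contrast(p))
```

A real pipeline would compute such statistics over several offsets and directions, stack them into a feature vector per ROI, and then let the feature selection stage (e.g. the genetic algorithm mentioned above) prune the vector before classification.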
Abstract:
For broadcasting purposes, MIXED REALITY, the combination of real and virtual scene content, has become ubiquitous nowadays. Mixed Reality recording still requires expensive studio setups and is often limited to simple color keying. We present a system for Mixed Reality applications which uses depth keying and provides three-dimensional mixing of real and artificial content. It features enhanced realism through automatic shadow computation, which we consider a core issue in obtaining realism and a convincing visual perception, besides the correct alignment of the two modalities and correct occlusion handling. Furthermore, we present a possibility to support the placement of virtual content in the scene. The core feature of our system is the incorporation of a TIME-OF-FLIGHT (TOF)-camera device. This device delivers real-time depth images of the environment at a reasonable resolution and quality. This camera is used to build a static environment model, and it also allows correct handling of mutual occlusions between real and virtual content, shadow computation and enhanced content planning. The presented system is inexpensive, compact, mobile, flexible and provides convenient calibration procedures. Chroma keying is replaced by depth keying, which is efficiently performed on the GRAPHICS PROCESSING UNIT (GPU) by the usage of an environment model and the current ToF-camera image. Automatic extraction and tracking of dynamic scene content is thereby performed, and this information is used for planning and alignment of virtual content. An additional valuable feature is that depth maps of the mixed content are available in real time, which makes the approach suitable for future 3DTV productions. This paper gives an overview of the whole system approach, including camera calibration, environment model generation, real-time keying and mixing of virtual and real content, shadowing for virtual content and dynamic object tracking for content planning.
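As an illustrative aside, the depth-keying step, replacing the chroma key with a per-pixel comparison of the live depth image against the static environment model, can be sketched in a few lines. The tolerance and depth values are invented, and the real system runs this on the GPU rather than in NumPy:

```python
import numpy as np

def depth_key(live_depth, env_depth, tolerance=0.05):
    """Foreground mask: pixels significantly closer to the camera than
    the static environment model are dynamic scene content.
    Depths in meters; tolerance absorbs sensor noise."""
    return live_depth < env_depth - tolerance

env = np.full((4, 4), 3.0)   # static background model: wall 3 m away
live = env.copy()
live[1:3, 1:3] = 1.2         # a dynamic object 1.2 m from the camera
mask = depth_key(live, env)
print(mask.sum(), "foreground pixels")
```

The same mask drives occlusion handling: wherever virtual content is placed behind a masked pixel's depth, the real foreground is composited on top.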
Abstract:
The Agrobacterium tumefaciens VirB/D4 type IV secretion system (T4SS) delivers oncogenic T-DNA and effector proteins to susceptible plant cells. This leads to the formation of tumors termed Crown Galls. The VirB/D4 T4SS is comprised of 12 subunits (VirB1 to VirB11 and VirD4), which assemble to form two structures, a secretion channel spanning the cell envelope and a T-pilus extending from the cell surface. In A. tumefaciens, the VirB2 pilin subunit is required for assembly of the secretion channel and is the main subunit of the T-pilus. The focus of this thesis is to define key reactions associated with the T4SS biogenesis pathway involving the VirB2 pilin. Topology studies demonstrated that VirB2 integrates into the inner membrane with two transmembrane regions, a small cytoplasmic loop, and a long periplasmic loop comprised of covalently linked N and C termini. VirB2 was shown by the substituted cysteine accessibility method (SCAM) to adopt distinct structural states when integrated into the inner membrane and when assembled as a component of the secretion channel and the T-pilus. The VirB4 and VirB11 ATPases were shown by SCAM to modulate the structural state of membrane-integrated VirB2 pilin, and evidence was also obtained that VirB4 mediates extraction of pilin from the membrane. A model that VirB4 functions as a pilin dislocase by an energy-dependent mechanism was further supported by coimmunoprecipitation and osmotic shock studies. Mutational studies identified two regions of VirB10, an N-terminal transmembrane domain and an outer membrane-associated domain termed the antennae projection, that contribute selectively to T-pilus biogenesis. Lastly, characterization of a VirB10 mutant that confers a ‘leaky’ channel phenotype further highlighted the role of VirB10 in gating substrate translocation across the outer membrane as well as T-pilus biogenesis. 
Results of my studies support a working model in which the VirB4 ATPase catalyzes dislocation of membrane-integrated pilin, and distinct domains of VirB10 coordinate pilin incorporation into the secretion channel and the extracellular T-pilus.