872 results for Production methods
Abstract:
Future hydrogen demand is expected to increase, both in existing industries (including upgrading of fossil fuels and ammonia production) and in new technologies such as fuel cells. Today, hydrogen is obtained predominantly by steam reforming of methane, but hydrocarbon-based routes cause environmental problems, and the market depends on the availability of this finite resource, which is undergoing rapid depletion. Therefore, alternative processes using renewable sources such as wind, solar energy and biomass are now being considered for the production of hydrogen. One such alternative is the so-called "steam-iron process", which consists of reducing a metal oxide with a hydrogen-containing feedstock, such as ethanol, and then reoxidizing the reduced material with water to produce "clean" hydrogen (water splitting). Thermochemical cycles of this kind have been studied before, but several recent developments, including more active catalysts, the flexibility of the feedstock (including renewable bio-alcohols) and the possibility of avoiding hydrogen purification, have significantly increased interest in this research topic. To better understand the reactions that govern the steam-iron route to hydrogen, it is necessary to probe them at the molecular level. Spectroscopic methods are an important tool for extracting information that can guide the development of more efficient materials and processes. In this research, ethanol was chosen as the reducing fuel, and the main goal was to study its interaction with different catalysts of similar structure (spinels), in order to correlate their composition with the mechanism of the anaerobic oxidation of ethanol, the first step of the steam-iron cycle.
To accomplish this, diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS) was used to study the surface composition of the catalysts during the adsorption of ethanol and its transformation during the temperature program. Furthermore, mass spectrometry was used to monitor the desorbed products. The set of studied materials includes Cu, Co and Ni ferrites, which were also characterized by X-ray diffraction, surface area measurements, Raman spectroscopy, and temperature-programmed reduction.
Abstract:
The present work is concerned with the forgotten elements of the Lebanese economy: agriculture and rural development. It investigates the main problems arising from these forgotten components, in particular the structure of the agricultural sector, production technology, income distribution, poverty, food security, territorial development and local livelihood strategies. It does so using quantitative Computable General Equilibrium (CGE) modeling and a qualitative phenomenological case-study analysis, both embedded in a critical review of the historical development of the political economy of Lebanon and a structural analysis of its economy. The research shows that under-development in Lebanese rural areas is not due to a lack of resources, but is rather the consequence of political choices. It further suggests that agriculture, in both its mainstream conventional and its innovative locally initiated forms of production, still represents important potential for inducing economic growth and development. To realize this potential, Lebanon has to take full advantage of its human and territorial capital by developing a rural development strategy based on two parallel sets of actions: one directed toward the support of local rural development initiatives, and the other toward intensive forms of production. In addition to its economic returns, such a strategy would promote social and political stability.
Abstract:
Graphene and graphenic derivatives have rapidly emerged as extremely promising systems for electronic, optical, thermal, and electromechanical applications. Several approaches have been developed to produce these materials (e.g. scotch tape, CVD, chemical and solvent exfoliation). In this work we report a chemical approach to produce graphene by reducing graphene oxide (GO) via thermal or electrical methods. A morphological and electrical characterization of these systems has been performed using different techniques such as SPM, SEM, TEM, Raman and XPS. Moreover, we studied the interaction between graphene derivatives and organic molecules, focusing on the following aspects:
- improvement of the optical contrast of graphene on different substrates for rapid monolayer identification [1]
- supramolecular interaction with organic molecules (e.g. thiophene, pyrene) [4]
- covalent functionalization with optically active molecules [2]
- preparation and characterization of organic/graphene field-effect transistors [3-5]
Graphene chemistry can potentially allow seamless integration of graphene technology into organic electronic devices to improve device performance and develop new applications for graphene-based materials.
[1] E. Treossi, M. Melucci, A. Liscio, M. Gazzano, P. Samorì, and V. Palermo, J. Am. Chem. Soc., 2009, 131, 15576.
[2] M. Melucci, E. Treossi, L. Ortolani, G. Giambastiani, V. Morandi, P. Klar, C. Casiraghi, P. Samorì, and V. Palermo, J. Mater. Chem., 2010, 20, 9052.
[3] J.M. Mativetsky, E. Treossi, E. Orgiu, M. Melucci, G.P. Veronese, P. Samorì, and V. Palermo, J. Am. Chem. Soc., 2010, 132, 14130.
[4] A. Liscio, G.P. Veronese, E. Treossi, F. Suriano, F. Rossella, V. Bellani, R. Rizzoli, P. Samorì, and V. Palermo, J. Mater. Chem., 2011, 21, 2924.
[5] J.M. Mativetsky, A. Liscio, E. Treossi, E. Orgiu, A. Zanelli, P. Samorì, and V. Palermo, J. Am. Chem. Soc., 2011, 133, 14320.
Abstract:
This work presents exact algorithms for Resource Allocation and Cyclic Scheduling Problems (RA&CSPs). Cyclic scheduling problems arise in a number of application areas, such as hoist scheduling, mass production, compiler design (implementing scheduling loops on parallel architectures), software pipelining, and embedded system design. The RA&CS problem concerns assigning time and resources to a set of activities, to be repeated indefinitely, subject to precedence and resource-capacity constraints. In this work we present two constraint programming frameworks addressing two different types of cyclic problems. First, we consider the disjunctive RA&CSP, where the allocation problem involves unary resources. Instances are described through the Synchronous Data-flow (SDF) Model of Computation. The key problem of finding a maximum-throughput allocation and scheduling of Synchronous Data-Flow graphs onto a multi-core architecture is NP-hard and has traditionally been solved by means of heuristic (incomplete) algorithms. We propose an exact (complete) algorithm for the computation of a maximum-throughput mapping of applications, specified as SDF graphs, onto multi-core architectures. Results show that the approach can handle realistic instances in terms of size and complexity. Next, we tackle the Cyclic Resource-Constrained Scheduling Problem (CRCSP). We propose a Constraint Programming approach based on modular arithmetic: in particular, we introduce a modular precedence constraint and a global cumulative constraint along with their filtering algorithms. Many traditional approaches to cyclic scheduling operate by fixing the period value and then solving a linear problem in a generate-and-test fashion. Conversely, our technique is based on a non-linear model and tackles the problem as a whole: the period value is inferred from the scheduling decisions.
The proposed approaches have been tested on a number of non-trivial synthetic instances and on a set of realistic industrial instances, achieving good results on problems of practical size.
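The iteration-shifted precedences at the heart of cyclic scheduling can be illustrated with a small feasibility check. This is an illustrative sketch only, not the thesis's constraint model or filtering algorithm; the (i, j, omega) encoding of precedences is an assumption:

```python
def feasible(starts, durations, period, precedences):
    """Check a candidate cyclic schedule against modular precedences.

    Each precedence (i, j, omega) means: activity j in iteration
    k + omega must start after activity i in iteration k finishes,
    i.e. starts[j] + omega * period >= starts[i] + durations[i].
    With iterations repeating every `period` time units, this single
    inequality covers all repetitions of the precedence.
    """
    return all(
        starts[j] + omega * period >= starts[i] + durations[i]
        for (i, j, omega) in precedences
    )
```

Note how the period enters the inequality non-linearly with the offsets, which is why, as the abstract observes, many approaches fix the period and test feasibility, whereas a solver treating the period as a decision variable must reason about this coupling directly.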
Abstract:
Foodborne diseases impact human health and economies worldwide in terms of health care and productivity losses. Prevention is necessary, and methods to detect, isolate and quantify foodborne pathogens play a fundamental role, evolving continuously to keep pace with microorganisms and food production. Official methods are mainly based on growing microorganisms in different media and isolating them on selective agars, followed by confirmation of presumptive colonies through biochemical and serological tests. A complete identification requires from 7 to 10 days. Over the last decades, new molecular techniques based on antibodies and nucleic acids have allowed more accurate typing and faster detection and quantification. The present thesis aims to apply molecular techniques to improve the performance of official methods regarding two pathogens: Shiga-like toxin-producing Escherichia coli (STEC) and Listeria monocytogenes. In 2011, a new STEC strain belonging to serogroup O104 caused a large outbreak; a method to detect and isolate STEC O104 is therefore needed. The first objective of this work is the detection, isolation and identification of STEC O104 in artificially contaminated sprouts. Multiplex PCR assays and anti-O104 antibodies, incorporated into reagents for immunomagnetic separation and latex agglutination, were employed. Contamination levels of less than 1 CFU/g were detected. Multiplex PCR assays permitted rapid screening of enriched food samples and identification of isolated colonies; immunomagnetic separation and latex agglutination allowed high sensitivity and rapid identification of the O104 antigen, respectively. The development of a rapid method to detect and quantify Listeria monocytogenes, a high-risk pathogen, is the second objective. Detection of 1 CFU/ml and quantification of 10–1,000 CFU/ml in raw milk were achieved in about 3 h by a sample pretreatment step followed by quantitative PCR. The growth of L. monocytogenes in raw milk was also evaluated.
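Quantification by quantitative PCR, as used above, conventionally relies on a log-linear standard curve relating the quantification cycle (Cq) to the starting concentration. A minimal sketch of that back-calculation follows; the slope and intercept values in the usage note are illustrative calibration numbers, not figures from the thesis:

```python
def quantify(cq, slope, intercept):
    """Estimate the starting concentration (e.g. CFU/ml) from a qPCR
    quantification cycle via a log-linear standard curve:

        Cq = slope * log10(concentration) + intercept

    `slope` and `intercept` come from a calibration with serial
    dilutions of known concentration; inverting the line gives the
    unknown sample's concentration.
    """
    return 10 ** ((cq - intercept) / slope)
```

For example, with an assumed slope of -3.32 (the textbook value for 100% amplification efficiency) and intercept 38, a measured Cq of 31.36 corresponds to roughly 100 CFU/ml.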
Abstract:
The production of the Z boson in proton-proton collisions at the LHC serves as a standard candle at the ATLAS experiment during early data-taking. The decay of the Z into an electron-positron pair gives a clean signature in the detector that allows for calibration and performance studies. The cross-section of ~1 nb allows first LHC measurements of parton density functions. In this thesis, simulations of 10 TeV collisions at the ATLAS detector are studied. The challenges for an experimental measurement of the cross-section with an integrated luminosity of 100 pb−1 are discussed. In preparation for the cross-section determination, the single-electron efficiencies are determined via a simulation-based method and in a test of a data-driven ansatz. The two methods show very good agreement and differ by ~3% at most. The ingredients of an inclusive and a differential Z production cross-section measurement at ATLAS are discussed and their possible contributions to systematic uncertainties are presented. For a combined sample of signal and background, the expected uncertainty on the inclusive cross-section for an integrated luminosity of 100 pb−1 is determined to be +/- 1.5% (stat) +/- 4.2% (syst) +/- 10% (lumi). The possibilities for single-differential cross-section measurements in rapidity and transverse momentum of the Z boson, important quantities because of their impact on parton density functions and their capability to probe non-perturbative effects in pQCD, are outlined. The issues of an efficiency correction based on electron efficiencies as a function of the electron's transverse momentum and pseudorapidity are studied. A possible alternative is demonstrated by expanding the two-dimensional efficiencies with the additional dimension of the invariant mass of the two leptons of the Z decay.
Abstract:
The present study focuses on the development of new group VIII metal on CeO2–ZrO2 (CZO) catalysts for use in reforming reactions for syngas production. The catalysts are tested in the oxy-reforming process, extensively studied by Barbera [44] in a new multistep process configuration with intermediate H2 membrane separation, which can be carried out at lower temperature (750°C) with respect to conventional reforming processes (900–1000°C). In spite of the milder temperatures, the oxy-reforming conditions (S/C = 0.7; O2/C = 0.21) remain critical regarding deactivation, which derives mainly from thermal sintering and carbon formation phenomena. The combination of the high thermal stability of ZrO2 with the redox properties of CeO2 allows the formation of a stable mixed-oxide system with high oxygen mobility. This feature can be exploited to counteract carbon deposition on the active metal surface through oxidation of the carbon by the mobile oxygen atoms available at the surface of the CZO support. Ce0.5Zr0.5O2 is the phase claimed to have the highest oxygen mobility, but its formation is difficult through classical synthesis (co-precipitation); hence a water-in-oil microemulsion method was widely studied and characterized. Two methods (IWI and bulk) for the insertion of the active metal (Rh, Ru, Ni) are followed and their effects, mainly related to metal stability and dispersion on the support, are discussed, correlating the characterization with the catalytic activity. Different parameters (calcination and reduction temperatures) are tuned to obtain the best catalytic system in terms of both activity and stability. Interesting results are obtained with impregnated and bulk catalysts, the latter representing a new class of catalysts. The best catalysts are also tested in a low-temperature (350–500°C) steam reforming process, and preliminary tests with H2 membrane separation have also been carried out.
Abstract:
This work deals with the car sequencing (CS) problem, a combinatorial optimization problem for sequencing mixed-model assembly lines. The aim is to find a production sequence for different variants of a common base product such that work overload of the respective line operators is avoided or minimized. The variants are distinguished by certain options (e.g., sun roof yes/no) and therefore require different processing times at the stations of the line. CS introduces a so-called sequencing rule H:N for each option, which restricts the occurrence of this option to at most H cars in any N consecutive variants. It seeks a sequence that leads to no, or a minimum number of, sequencing-rule violations. In this work, the suitability of CS for workload-oriented sequencing is analyzed; to this end, its solution quality is compared in experiments to the related mixed-model sequencing problem. A new sequencing-rule generation approach as well as a new lower bound for the problem are presented. Different exact and heuristic solution methods for CS are developed and their efficiency is shown in experiments. Furthermore, CS is adjusted and applied to a resequencing problem with pull-off tables.
Abstract:
In this thesis we investigate several phenomenologically important properties of top-quark pair production at hadron colliders. We calculate double differential cross sections in two different kinematical setups, pair invariant-mass (PIM) and single-particle inclusive (1PI) kinematics. In pair invariant-mass kinematics we are able to present results for the double differential cross section with respect to the invariant mass of the top-quark pair and the top-quark scattering angle. Working in the threshold region, where the pair invariant mass M is close to the partonic center-of-mass energy sqrt{hat{s}}, we are able to factorize the partonic cross section into different energy regions. We use renormalization-group (RG) methods to resum large threshold logarithms to next-to-next-to-leading-logarithmic (NNLL) accuracy. On a technical level this is done using effective field theories, such as heavy-quark effective theory (HQET) and soft-collinear effective theory (SCET). The same techniques are applied when working in 1PI kinematics, leading to a calculation of the double differential cross section with respect to the transverse momentum pT and the rapidity of the top quark. We restrict the phase space such that only soft emission of gluons is possible, and perform an NNLL resummation of threshold logarithms. The obtained analytical expressions enable us to precisely predict several observables, and a substantial part of this thesis is devoted to their detailed phenomenological analysis. Matching our results in the threshold regions to the exact ones at next-to-leading order (NLO) in fixed-order perturbation theory allows us to make predictions at NLO+NNLL order in RG-improved perturbation theory, and at approximate next-to-next-to-leading order (NNLO) in fixed-order perturbation theory. We give numerical results for the invariant mass distribution of the top-quark pair, and for the top-quark transverse-momentum and rapidity spectrum.
We predict the total cross section, separately for both kinematics. Using these results, we analyze subleading contributions to the total cross section in 1PI and PIM originating from power corrections to the leading terms in the threshold expansions, and compare them to previous approaches. We then combine our PIM and 1PI results for the total cross section, thereby eliminating uncertainties due to these corrections. The combined predictions for the total cross section are presented as a function of the top-quark mass in the pole, the minimal-subtraction (MS), and the 1S mass schemes. In addition, we calculate the forward-backward (FB) asymmetry at the Tevatron, in the laboratory and ttbar rest frames, as a function of the rapidity and the invariant mass of the top-quark pair at NLO+NNLL. We also give binned results for the asymmetry as a function of the invariant mass and the rapidity difference of the ttbar pair, and compare those to recent measurements. As a last application we calculate the charge asymmetry at the LHC as a function of a lower rapidity cut-off for the top and anti-top quarks.
Abstract:
The increase in aquaculture operations worldwide has provided new opportunities for the transmission of aquatic viruses. The occurrence of viral diseases remains a significant limiting factor for aquaculture production and its sustainability. The ability to quickly establish the presence or absence of a pathogenic organism in fish would bring significant advantages for aquaculture systems. Several molecular methods have found successful application in fish pathology, both for confirmatory diagnosis of overt diseases and for detection of asymptomatic infections. However, many variants occur among fish host species and virus strains, and consequently specific methods need to be developed and optimized for each pathogen, and often also for each host species. The first chapter of this PhD thesis presents a complete description of the major viruses that infect fish and provides relevant information regarding the most common methods and emerging technologies for the molecular diagnosis of viral diseases of fish. The development and application of a real-time PCR assay for the detection and quantification of lymphocystivirus is described in the second chapter. The assay proved highly sensitive, specific, reproducible and versatile for the detection and quantification of lymphocystivirus, and can find multiple applications, such as asymptomatic-carrier detection or pathogenesis studies of different LCDV strains. In the third chapter, a multiplex RT-PCR (mRT-PCR) assay was developed for the simultaneous detection of viral haemorrhagic septicaemia (VHS), infectious haematopoietic necrosis (IHN), infectious pancreatic necrosis (IPN) and sleeping disease (SD) in a single assay. This method was able to efficiently detect the viral RNA in tissue samples, revealing the presence of single infections and co-infections in rainbow trout samples.
The mRT-PCR method proved to be an accurate and fast way to support traditional diagnostic techniques in the diagnosis of the major viral diseases of rainbow trout.
Abstract:
This doctorate was funded by the Regione Emilia Romagna within a Spinner PhD project coordinated by the University of Parma and involving the universities of Bologna, Ferrara and Modena. The aims of the project were:
- production of polymorphs, solvates, hydrates and co-crystals of active pharmaceutical ingredients (APIs) and agrochemicals with green chemistry methods;
- optimization of molecular and crystalline forms of APIs and pesticides in relation to activity, bioavailability and patentability.
In the last decades, a growing interest in the solid-state properties of drugs, in addition to their solution chemistry, has blossomed. Obtaining the desired and/or more stable polymorph during the production process can be a challenge for industry. The study of crystalline forms can be a valuable step toward producing new polymorphs and/or co-crystals with better physico-chemical properties, such as solubility, permeability, thermal stability, habit, bulk density, compressibility, friability, hygroscopicity and dissolution rate, with a view to potential industrial applications. Selected APIs were studied and the relationship between their crystal structure and properties was investigated, both in the solid state and in solution. Polymorph screening and the synthesis of solvates and molecular/ionic co-crystals were performed according to green chemistry principles. Part of this project was developed in collaboration with chemical/pharmaceutical companies such as BASF (Germany) and UCB (Belgium). We focused on the optimization of the conditions and parameters of crystallization processes (additives, concentration, temperature), and on the synthesis and characterization of ionic co-crystals.
Moreover, during a four-month research period in the laboratories of Professor Nair Rodriguez-Hormedo (University of Michigan), the equilibrium stability in aqueous solution of ionic co-crystals (ICCs) of the API piracetam was investigated, in order to understand the relationship between their solid-state and solution properties, in view of the future design of new crystalline drugs with predefined solid and solution properties.
Abstract:
The Large Hadron Collider (LHC), located at the CERN laboratory in Geneva, is the largest particle accelerator in the world. One of the main research fields at the LHC is the study of the Higgs boson, the most recently discovered particle, observed by the ATLAS and CMS experiments. Due to the small production cross section of the Higgs boson, only a substantial amount of data offers the chance to study this particle's properties. To perform these searches, it is desirable to avoid contamination of the signal signature by the numerous and varied background processes produced in pp collisions at the LHC. Considerable attention is devoted to multivariate methods which, compared to a standard cut-based analysis, can enhance the selection of a Higgs boson produced in association with a top-quark pair through a dileptonic final state (ttH channel). The data collected up to 2012 are not sufficient to supply a significant number of ttH events; however, the methods applied in this thesis will provide a powerful tool for the increasing statistics that will be collected during the next LHC data taking.
Abstract:
BACKGROUND: Only a few standardized apraxia scales are available, and they do not cover all domains and semantic features of gesture production. Therefore, the objective of the present study was to evaluate the reliability and validity of a newly developed test of upper limb apraxia (TULIA), which is comprehensive yet short to administer. METHODS: The TULIA consists of 48 items covering the imitation and pantomime domains of non-symbolic (meaningless), intransitive (communicative) and transitive (tool-related) gestures, corresponding to 6 subtests. A 6-point scoring method (0-5) was used (score range 0-240). Performance was assessed by blinded raters based on videos of 133 stroke patients, 84 with left hemisphere damage (LHD) and 49 with right hemisphere damage (RHD), as well as 50 healthy subjects (HS). RESULTS: The clinimetric findings demonstrated mostly good to excellent internal consistency, as well as inter-rater and intra-rater (test-retest) reliability, both at the level of the six subtests and at the individual item level. Criterion validity was evaluated by confirming hypotheses based on the literature. Construct validity was demonstrated by a high correlation (r = 0.82) with the De Renzi test. CONCLUSION: These results show that the TULIA is both a reliable and a valid test to systematically assess gesture production. The test can be easily applied and is therefore useful for both research purposes and clinical practice.
Abstract:
Plant volatiles typically occur as a complex mixture of low-molecular-weight lipophilic compounds derived from different biosynthetic pathways, and are seemingly produced as part of a defense strategy against biotic and abiotic stress, as well as contributing to various physiological functions of the producer organism. The biochemistry and molecular biology of plant volatiles is complex, and involves the interplay of several biochemical pathways and hundreds of genes. All plants are able to store and emit volatile organic compounds (VOCs), but the process shows remarkable genotypic variation and phenotypic plasticity. From a physiological standpoint, plant volatiles are involved in three critical processes, namely plant-plant interaction, the signaling between symbiotic organisms, and the attraction of pollinating insects. Their role in these "housekeeping" activities underlies agricultural applications that range from the search for sustainable methods for pest control to the production of flavors and fragrances. On the other hand, there is also growing evidence that VOCs are endowed with a range of biological activities in mammals, and that they represent a substantially under-exploited and still largely untapped source of novel drugs and drug leads. This review summarizes recent major developments in the study of biosynthesis, ecological functions and medicinal applications of plant VOCs.
Abstract:
A custom-made 228Th source with an activity of several MBq was produced for the Borexino experiment to study the external background of the detector. The aim was to reduce the unwanted neutron emission produced via (alpha,n) reactions in the ceramics typically used for commercial 228Th sources. For this purpose a ThCl4 solution was chemically converted into ThO2 and embedded into a gold foil. The paper describes the production and characterization of the custom-made source by means of gamma-activity, dose-rate and neutron-source-strength measurements. From gamma-spectroscopic measurements it was deduced that the activity transfer from the initial solution to the final source was >91% (at 68% C.L.) and that the final activity was (5.41+-0.30) MBq. The dose rate, measured with two dosimeters, was 12.1 mSv/h and 14.3 mSv/h at 1 cm distance. The neutron source strength of the 5.41 MBq 228Th source was determined to be (6.59+-0.85) neutrons/s.