Abstract:
The main goal of this special issue was to gather contributions on the latest breakthrough methods for obtaining valuable compounds and energy/fuel from waste valorization. Valorization is a relatively new approach in industrial waste management, a key issue in promoting sustainable development. In this field, the recovery of value-added substances such as antioxidants, proteins and vitamins from the processing of agro-industrial byproducts is worth mentioning. Another important valorization approach is the use of biogas from waste treatment plants for the production of energy. Several approaches involving physical, chemical, thermal and biological processes that ensure reduced emissions and energy consumption were taken into account. The papers selected for this topical issue represent some of the most researched methods that currently promote the valorization of wastes into energy and useful materials ...
Abstract:
Soluble MHC-peptide complexes, commonly known as tetramers, allow the detection and isolation of antigen-specific T cells. Although other types of soluble MHC-peptide complexes have been introduced, the most commonly used MHC class I staining reagents are those originally described by Altman and Davis. As these reagents have become an essential tool for T cell analysis, it is important to have a large repertoire of such reagents to cover a broad range of applications in cancer research and clinical trials. Our tetramer collection currently comprises 228 human and 60 mouse tetramers, and new reagents are continuously being added. For the MHC II tetramers, the list currently contains 21 human (HLA-DR, DQ and DP) and 5 mouse (I-A(b)) tetramers. Quantitative enumeration of antigen-specific T cells by tetramer staining, especially at low frequencies, critically depends on the quality of the tetramers and on the staining procedures. For conclusive longitudinal monitoring, standardized reagents and analysis protocols need to be used. This is especially true for the monitoring of antigen-specific CD4+ T cells, as there are large variations in the quality of MHC II tetramers and staining conditions. This commentary provides an overview of our tetramer collection and guidance on how tetramers should be used to obtain optimal results.
Abstract:
The present prospective study, with a five-year follow-up, presents an extensive psychiatric and educational assessment of an adolescent population (N = 30) in the age range 14-20, suffering from various psychiatric disorders yet able to follow a normal academic program. The residential settings where the study took place provide both psychiatric and schooling facilities. In this environment, how effective is long-term hospitalization? Are there criteria for predicting outcomes? After discharge, can social adjustment difficulties be prevented? The assessment instruments are described and the results of a preliminary study are presented. The current data seem to confirm the impact of special treatment facilities combining schooling and psychiatric settings on the long-term outcome of adolescents.
Abstract:
This report is submitted in compliance with Administrative Rule 761-401.18(3), Discontinuance, and discusses seven special registration plates established by the legislature in 2011. This rule requires the department to report to the legislature if any of the special registration plates subject to this rule have not been placed into production because the department has not received the minimum number of paid applications (250) required to produce the plates under Iowa Code section 321.34, subsections 20C, 25 and 26 as amended by 2011 Iowa Acts, House File 651, section 2.
Abstract:
Viruses are known to tolerate wide ranges of pH and salt conditions and to withstand internal pressures as high as 100 atmospheres. In this paper we investigate the mechanical properties of viral capsids, calling explicit attention to the inhomogeneity of the shells that is inherent to their discrete and polyhedral nature. We calculate the distribution of stress in these capsids and analyze their response to isotropic internal pressure (arising, for instance, from genome confinement and/or osmotic activity). We compare our results with appropriate generalizations of classical (i.e., continuum) elasticity theory. We also examine competing mechanisms for viral shell failure, e.g., in-plane crack formation vs radial bursting. The biological consequences of the special stabilities and stress distributions of viral capsids are also discussed.
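As a point of reference for the continuum limit mentioned above, a minimal sketch (the textbook thin-shell result, not a finding of the paper): for a uniformly pressurized thin spherical shell, the in-plane stress follows the Laplace law,

```latex
% In-plane (hoop) stress in a thin spherical shell of radius R and
% thickness h under internal pressure p (classical continuum result):
\sigma = \frac{p R}{2 h}
```

The inhomogeneity emphasized above shows up as deviations of the discrete, polyhedral capsid's stress distribution from such a uniform continuum value.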
Abstract:
Spatial data analysis, mapping and visualization are of great importance in various fields: environment, pollution, natural hazards and risks, epidemiology, spatial econometrics, etc. A basic task of spatial mapping is to make predictions based on empirical data (measurements). A number of state-of-the-art methods can be used for this task: deterministic interpolators; geostatistical methods, i.e. the family of kriging estimators (Deutsch and Journel, 1997); machine learning algorithms such as artificial neural networks (ANN) of different architectures; hybrid ANN-geostatistics models (Kanevski and Maignan, 2004; Kanevski et al., 1996); etc. All of these methods can be used to solve the problem of spatial data mapping. Environmental empirical data are always contaminated/corrupted by noise, often of unknown nature. This is one reason why deterministic models can be inconsistent, since they treat the measurements as values of some unknown function that should be interpolated. Kriging estimators treat the measurements as a realization of some spatial random process. To obtain a kriging estimate, one has to model the spatial structure of the data: the spatial correlation function or (semi-)variogram. This task can be complicated if the number of measurements is insufficient, and the variogram is sensitive to outliers and extremes. ANN is a powerful tool, but it also suffers from a number of drawbacks. ANNs of a special type - multilayer perceptrons - are often used as a detrending tool in hybrid (ANN + geostatistics) models (Kanevski and Maignan, 2004). Therefore, the development and adaptation of a method that is nonlinear, robust to noise in the measurements, able to deal with small empirical datasets and backed by a solid mathematical foundation is of great importance. The present paper deals with such a model, based on Statistical Learning Theory (SLT): Support Vector Regression.
SLT is a general mathematical framework devoted to the problem of estimating dependencies from empirical data (Hastie et al., 2004; Vapnik, 1998). SLT models for classification - Support Vector Machines - have shown good results on different machine learning tasks. The results of SVM classification of spatial data are also promising (Kanevski et al., 2002). The properties of SVM for regression - Support Vector Regression (SVR) - are less studied. The first results of applying SVR to the spatial mapping of physical quantities were obtained by the authors for the mapping of medium porosity (Kanevski et al., 1999) and for the mapping of radioactively contaminated territories (Kanevski and Canu, 2000). The present paper is devoted to further understanding the properties of the SVR model for spatial data analysis and mapping. A detailed description of SVR theory can be found in (Cristianini and Shawe-Taylor, 2000; Smola, 1996), and the basic equations for nonlinear modeling are given in Section 2. Section 3 discusses the application of SVR to spatial data mapping on a real case study: soil pollution by the Cs137 radionuclide. Section 4 discusses the properties of the model applied to noisy data or data with outliers.
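As an illustration of the modeling step described above, a minimal sketch of epsilon-insensitive Support Vector Regression applied to 2D spatial data, using scikit-learn rather than the authors' software; the synthetic data, kernel choice and hyperparameters (C, epsilon, gamma) are illustrative assumptions, not values from the case study:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# synthetic "measurements": 200 sample points in the unit square
X = rng.uniform(0.0, 1.0, size=(200, 2))
# a smooth underlying field plus noise, mimicking contaminated data
y = np.sin(3 * X[:, 0]) * np.cos(3 * X[:, 1]) + rng.normal(0, 0.05, 200)

# epsilon-insensitive SVR with an RBF kernel; hyperparameters are
# illustrative, in practice tuned by cross-validation
model = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma=2.0)
model.fit(X, y)

# predict on a regular grid -- the "mapping" step
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z = model.predict(grid).reshape(50, 50)
print(z.shape)  # -> (50, 50)
```

The epsilon-insensitive loss is what gives SVR its robustness to measurement noise: residuals smaller than epsilon are simply ignored, so the model does not chase the noise floor.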
Abstract:
A quarterly journal of Iowa authors and their works, produced by the State Historical Society of Iowa. Special Commemorative Issue: Mormon Handcart Trek
Abstract:
Diabetes is a recognized risk factor for cardiovascular disease and heart failure. Diabetic cardiovascular dysfunction also underlies the development of diabetic retinopathy, nephropathy and neuropathy. Despite the broad availability of antidiabetic therapy, glycemic control remains a major challenge in the management of diabetic patients. Hyperglycemia triggers the formation of advanced glycosylation end products (AGEs), activates protein kinase C, and enhances the polyol pathway and glucose autoxidation; these processes, coupled with elevated levels of free fatty acids and leptin, have been implicated in the increased generation of superoxide anion by mitochondria, NADPH oxidases and xanthine oxidoreductase in the diabetic vasculature and myocardium. Superoxide anion interacts with nitric oxide, forming the potent toxin peroxynitrite via a diffusion-limited reaction, which, in concert with other oxidants, triggers activation of stress kinases, endoplasmic reticulum stress, and mitochondrial and poly(ADP-ribose) polymerase 1-dependent cell death; dysregulates autophagy/mitophagy; inactivates key proteins involved in myocardial calcium handling/contractility and antioxidant defense; and activates matrix metalloproteinases and redox-dependent pro-inflammatory transcription factors (e.g. nuclear factor kappaB), promoting inflammation and AGE formation, eventually culminating in myocardial dysfunction, remodeling and heart failure. Understanding the complex interplay of oxidative/nitrosative stress with pro-inflammatory, metabolic and cell-death pathways is critical to devising novel targeted therapies for diabetic cardiomyopathy, which are overviewed in this brief synopsis. This article is part of a Special Issue entitled: Autophagy and protein quality control in cardiometabolic diseases.
Abstract:
The main objective of this thesis is to assess the profitability of standardizing frequently recurring customer-specific product features of elevator automatic doors at KONE's Hyvinkää factory. The thesis also evaluates the applicability of mass-customization principles to the special-door delivery process, and the need for activity-based cost information in decisions concerning product features. Customer-specific door production was divided into categories, and by analyzing their volumes and the engineering hours spent on product modifications, it was determined whether any customizable product feature accounts for a large enough unit demand, or consumes enough engineering hours, for its standardization to be profitable. Door cost-structure examples were also used to determine the share of engineering in total costs. Based on the analyses, customer-specific modifications were found to be fragmented, and only a few features formed a large and uniform enough whole to be economically worthwhile to standardize. Owing to the complex product structure, the doctrines of mass customization cannot be copied directly to the delivery process of special elevator doors, but they contain useful practices for developing the process.
Abstract:
The study consists of four articles dealing with the innovativeness of Finnish small and medium-sized industrial enterprises (industrial SMEs), its attributes and its indicators. The study examines definitions of innovativeness presented both in the literature and in interviews with SME managers and with corporate analysts involved in financing decisions on SME development projects. As indicators of innovativeness, the financing and evaluation criteria applied to SME development projects are examined from the perspectives of both external financiers and SME managers. Particular attention is paid to the qualitative and non-numeric evaluation criteria of innovativeness applied. Both in the literature and in the interviews with ten corporate analysts and the managers of six case companies, the novelty of an innovation is associated with innovativeness. Other important attributes linked to innovativeness were markets, differentiation from other companies, and the creativity of individuals. People-oriented and individual-related perspectives are emphasized in the definitions of innovativeness given by corporate analysts and SME managers, whereas the literature gives more weight to the environment, products and markets. Corporate analysts considered factors related to the company and its manager important input criteria when evaluating development projects to be financed. The commercial success of products was the most important output factor from the financier's point of view. In the case companies examined, decision-making on development projects and their evaluation is, on the other hand, intuitive and may even be unconscious, because the companies' development activity is modest. Small-business managers emphasize monetary measures in evaluation, although both numeric and qualitative criteria are applied. The most likely reason for this is the limited financial resources of small companies.
Another possible reason for emphasizing financial factors is that today's ideal manager is understood to be analytical and, among other things, to monitor cash flows. Yet innovative managers regard the creation of innovations as one of the fun sides of life. The innovative case companies are future- and growth-oriented at the strategic level. At the operational level, they produce inventions and innovations. The companies examined nevertheless hold few patents. Both the innovative and the less innovative case companies are strongly customer-oriented and specialized in certain products and customers. Customer needs are satisfied by developing products that meet them. As a result, most of the companies' development activity is directed at products or production.
Abstract:
Reference collections of multiple Drosophila lines with accumulating collections of "omics" data have proven especially valuable for the study of population genetics and complex trait genetics. Here we present a description of a resource collection of 84 strains of Drosophila melanogaster whose genome sequences were obtained after 12 generations of full-sib inbreeding. The initial rationale for this resource was to foster the development of a systems biology platform for modeling metabolic regulation, using natural polymorphisms as perturbations. As reference lines, they are amenable to repeated phenotypic measurements, and a large collection of metabolic traits has already been assayed. Another key feature of these strains is their widespread geographic origin, coming from Beijing, Ithaca, the Netherlands, Tasmania, and Zimbabwe. After obtaining 12.5× coverage of paired-end Illumina sequence reads, SNP and indel calls were made with the GATK platform. Thorough quality control was enabled by deep sequencing one line to >100×, and single-nucleotide polymorphisms and indels were validated using ddRAD-sequencing as an orthogonal platform. In addition, a series of preliminary population genetic tests were performed with these single-nucleotide polymorphism data to assess data quality. We found 83 segregating inversions among the lines, and as expected, these were especially abundant in the African sample. We anticipate that this collection will make a useful addition to the set of reference D. melanogaster strains, thanks to its geographic structuring and unusually high level of genetic diversity.
Abstract:
This thesis describes the development of advanced silicon radiation detectors and their characterization by simulations, used in the search for elementary particles at the European Organization for Nuclear Research, CERN. Silicon particle detectors will face extremely harsh radiation in the proposed upgrade of the Large Hadron Collider, the future high-energy physics experiment Super-LHC. The increase in the maximal fluence and the beam luminosity, up to 10^16 n_eq/cm^2 and 10^35 cm^-2 s^-1, will require detectors with a dramatic improvement in radiation hardness, since such a fluence is far beyond the operational limits of present silicon detectors. The main goals of detector development concentrate on minimizing radiation degradation. This study contributes mainly to device engineering technology for developing more radiation-hard particle detectors with better characteristics; defect engineering technology is also discussed. In the region nearest the beam in Super-LHC, the only detector choice is 3D detectors, short of replacing other types of detectors every two years. Interest in 3D silicon detectors is continuously growing because of their many advantages over conventional planar detectors: the devices can be fully depleted at low bias voltages, charge collection is fast, and the collection distances are about one order of magnitude shorter than those of planar-technology strip and pixel detectors, whose electrodes are limited to the detector surface. The 3D detectors also exhibit high radiation tolerance, extending the ability of silicon detectors to operate after irradiation. Two parameters, the full depletion voltage and the electric field distribution, are discussed in more detail in this study. Full depletion of the detector is important because only the depleted region of the detector is active for particle tracking.
Similarly, a high electric field makes the detector volume sensitive, while low-field regions are insensitive to particles. This study presents simulation results for the full depletion voltage and the electric field distribution of various types of 3D detectors. First, a 3D detector with an n-type substrate and partially penetrating p-type electrodes is examined. A detector of this type has a low electric field on the pixel side and suffers from type inversion. Next, the substrate is changed to p-type, and detectors having electrodes of a single doping type and of dual doping types are examined. The electric field profile in a dual-column 3D Si detector is more uniform than that in a single-column-type 3D detector. The dual-column detectors have the best radiation hardness because of their low depletion voltages and short drift distances.
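The advantage of the 3D geometry for the full depletion voltage can be sketched with the textbook planar-detector relation (a standard semiconductor-detector result, not taken from the thesis itself):

```latex
% Full-depletion voltage of a planar silicon detector of thickness d
% with effective doping concentration N_eff:
V_{fd} \simeq \frac{e\,\lvert N_{\mathrm{eff}} \rvert\, d^{2}}{2\,\varepsilon_{\mathrm{Si}}\varepsilon_{0}}
% In a 3D detector the depleted distance is the inter-electrode
% spacing L << d, so V_fd scales as L^2 and remains low even as
% irradiation increases |N_eff|.
```

The quadratic dependence on the depleted distance is why shrinking the electrode spacing, rather than the wafer thickness, is the key lever for radiation hardness.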
Abstract:
Postmortem imaging consists of the non-invasive examination of bodies using medical imaging techniques. However, quantifying gas volumes and interpreting the gas collections found in cadavers remain difficult. We used whole-body postmortem multi-detector computed tomography (MDCT), followed by a full autopsy or external examination, to detect gaseous volumes in bodies. Gases were sampled from the cardiac cavities, and the sample compositions were analyzed by headspace gas chromatography-mass spectrometry/thermal conductivity detection (HS-GC-MS/TCD). Three categories were defined according to the presumed origin of the gas: alteration/putrefaction, high-magnitude vital gas embolism (e.g., from a scuba diving accident) and gas embolism of lower magnitude (e.g., following a traumatic injury). Cadaveric alteration gas was diagnosed if even one gas from among hydrogen, hydrogen sulfide and methane was detected. In alteration cases, the carbon dioxide/nitrogen ratio was often >0.2, except in cases of advanced alteration, where the presence of methane was the best indicator. In the gas embolism cases (vital or not), hydrogen, hydrogen sulfide and methane were absent. Moreover, with high-magnitude vital gas embolisms, the carbon dioxide content was >20% and the carbon dioxide/nitrogen ratio was >0.2. With gas embolisms of lower magnitude (gas presence consecutive to a traumatic injury), the carbon dioxide content was <20% and the carbon dioxide/nitrogen ratio was often <0.2. We found that gas analysis usefully assisted the postmortem imaging diagnosis of causes of death. Based on the quantification of gaseous cardiac samples, reliable indicators were determined to document causes of death. MDCT examination of the body, like gas sampling, must be performed as quickly as possible to avoid generating artifactual alteration gases. Analysis of the cardiac gas composition makes it possible to distinguish alteration gases from gas embolisms of different magnitudes.
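The indicator thresholds above can be collected into a small decision rule. A minimal sketch for illustration (the function name, argument convention and strict cutoff handling are assumptions; the abstract's "often" qualifiers mean these thresholds are tendencies, not a validated forensic procedure):

```python
def classify_cardiac_gas(h2, h2s, ch4, co2, n2):
    """Heuristic triage of a cardiac gas sample (percent composition),
    following the indicator thresholds summarized in the abstract.
    Illustrative only -- not a validated diagnostic tool."""
    # Any alteration marker gas present -> alteration/putrefaction;
    # methane alone remains the best indicator in advanced alteration.
    if h2 > 0 or h2s > 0 or ch4 > 0:
        return "alteration/putrefaction"
    # Marker gases absent -> some form of gas embolism; magnitude is
    # judged from the CO2 content and the CO2/N2 ratio.
    ratio = co2 / n2 if n2 else float("inf")
    if co2 > 20 and ratio > 0.2:
        return "high-magnitude vital gas embolism"
    return "lower-magnitude gas embolism"

print(classify_cardiac_gas(0.5, 0.0, 0.0, co2=15.0, n2=70.0))
# -> alteration/putrefaction
print(classify_cardiac_gas(0.0, 0.0, 0.0, co2=35.0, n2=60.0))
# -> high-magnitude vital gas embolism
```

Real samples near the cutoffs would of course require the full context (imaging findings, degree of alteration) rather than a single threshold test.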
Abstract:
The networking and digitalization of audio equipment have created a need for control protocols. These protocols offer new services to customers and ensure that the equipment operates correctly. The control protocols used in computer networks are not directly applicable, since embedded systems have resource and cost limitations. This master's thesis presents the design and implementation of new loudspeaker control network protocols. The protocol stack was required to be reliable, to have short response times, to configure the network automatically and to support the dynamic addition and removal of loudspeakers. The implemented protocol stack was also required to be as efficient and lightweight as possible, because the network nodes are fairly simple and lack processing power. The protocol stack was thoroughly tested, validated and verified. The protocols were formally described using LOTOS (Language Of Temporal Ordering Specification) and verified using reachability analysis. A prototype of the loudspeaker network was built and used to test the operation and performance of the control protocols. The implemented control protocol stack met the design specifications and proved to be highly reliable and efficient.