917 results for "post-processing method"
Abstract:
Peat deposits in Greenland and Denmark were investigated to show that high-resolution dating of these archives of atmospheric deposition can be provided for the last 50 years by radiocarbon dating using the atmospheric bomb pulse. ¹⁴C was determined in macrofossils from sequential one-cm slices using accelerator mass spectrometry (AMS). Values were calibrated with a general-purpose curve derived from annually averaged atmospheric ¹⁴CO₂ values in the northernmost northern hemisphere (NNH, 30°-90°N). We present a thorough review of ¹⁴C bomb-pulse data from the NNH, including our own measurements made in tree rings and seeds from Arizona as well as other previously published data. We show that our general-purpose calibration curve is valid for the whole NNH, producing accurate dates within 1-2 years. In consequence, ¹⁴C AMS can precisely date individual points in recent peat deposits within the range of the bomb pulse (from the mid-1950s on). Comparing the ¹⁴C AMS results with the customary dating method for recent peat profiles by ²¹⁰Pb, we show that the use of ¹³⁷Cs to validate and correct ²¹⁰Pb dates is more problematic than previously supposed. As a unique example of our technique, we show how this chronometer can be applied to identify temporal changes in Hg concentrations in Danish and Greenland peat cores.
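The dating principle can be sketched as curve inversion: on the declining limb of the bomb pulse (after the mid-1960s peak), atmospheric ¹⁴C decreases monotonically, so a measured value maps to a unique calendar year by interpolation. A minimal sketch, using illustrative placeholder values rather than the actual NNH calibration curve:

```python
# Sketch: invert a bomb-pulse calibration curve by linear interpolation.
# The curve values below are illustrative placeholders, NOT real NNH data.

def bomb_pulse_year(f14c, curve):
    """Map a measured 14C value to a calendar year on the declining limb
    of the bomb pulse, where the curve is strictly decreasing.
    `curve` is a list of (year, value) pairs sorted by year."""
    for (y0, v0), (y1, v1) in zip(curve, curve[1:]):
        if v1 <= f14c <= v0:  # value falls between the two samples
            frac = (v0 - f14c) / (v0 - v1)
            return y0 + frac * (y1 - y0)
    raise ValueError("value outside the declining limb of the curve")

# Illustrative post-peak curve (year, value), monotonically decreasing.
curve = [(1965, 1.70), (1975, 1.35), (1985, 1.20), (1995, 1.10), (2005, 1.06)]
```

With an annually resolved curve the same lookup yields the 1-2 year precision discussed above; real applications must also account for measurement uncertainty on both axes.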
Abstract:
Referred to as orthographic depth, the degree of consistency of grapheme/phoneme correspondences varies across languages, from high in shallow orthographies to low in deep orthographies. The present study investigates the impact of orthographic depth on the reading route by analyzing evoked potentials to words in a deep (French) and a shallow (German) language presented to highly proficient bilinguals. ERP analyses of German and French words revealed significant topographic modulations 240-280 ms post-stimulus onset, indicative of distinct brain networks engaged in reading over this time window. Source estimations revealed that these effects stemmed from modulations of left insular, inferior frontal, and dorsolateral regions (German > French) previously associated with phonological processing. Our results show that reading in a shallow language was associated with stronger engagement of phonological pathways than reading in a deep language. Thus, the lexical pathways favored in word reading are reinforced by phonological networks more strongly in the shallow than in the deep orthography.
Abstract:
BACKGROUND AND OBJECTIVES: The biased interpretation of ambiguous social situations is considered a maintaining factor of Social Anxiety Disorder (SAD). Studies on the modification of interpretation bias have shown promising results in laboratory settings. The present study aims at pilot-testing an Internet-based training that targets interpretation and judgmental bias. METHOD: Thirty-nine individuals meeting diagnostic criteria for SAD participated in an 8-week, unguided program. Participants were presented with ambiguous social situations, were asked to choose between neutral, positive, and negative interpretations, and were required to evaluate the costs of potential negative outcomes. Participants received elaborate automated feedback on their interpretations and judgments. RESULTS: There was a pre-to-post reduction of the targeted cognitive processing biases (d = 0.57-0.77) and of social anxiety symptoms (d = 0.87). Furthermore, results showed changes in depression and general psychopathology (d = 0.47-0.75). Decreases in cognitive biases and symptom changes did not correlate. The results held stable when accounting for drop-outs (26%) and over a 6-week follow-up period. Forty-five percent of the completer sample showed clinically significant change, and almost half of the participants (48%) no longer met diagnostic criteria for SAD. LIMITATIONS: As the study lacks a control group, the results lend only preliminary support to the efficacy of the intervention. Furthermore, the mechanism of change remains unclear. CONCLUSION: These first results suggest a beneficial effect of the program for SAD patients. The treatment proved to be feasible and acceptable. Future research should evaluate the intervention in a randomized controlled setting.
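The effect sizes quoted above (e.g. d = 0.87) are standardized mean changes; one common pre-post convention divides the mean change by the pooled standard deviation of the two time points (the abstract does not state which exact formula was used). A minimal sketch under that assumption:

```python
import math

def cohens_d_prepost(pre, post):
    """Pre-post effect size: mean change divided by the pooled SD of the
    two time points. One common convention among several; the study's
    exact formula is not stated in the abstract."""
    n = len(pre)
    mean_pre = sum(pre) / n
    mean_post = sum(post) / n
    var_pre = sum((x - mean_pre) ** 2 for x in pre) / (n - 1)
    var_post = sum((x - mean_post) ** 2 for x in post) / (n - 1)
    sd_pooled = math.sqrt((var_pre + var_post) / 2)
    return (mean_pre - mean_post) / sd_pooled
```

For symptom scales where lower is better, a positive d then indicates improvement from pre to post.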
Abstract:
Behavioural tests to assess affective states are widely used in human research and have recently been extended to animals. These tests assume that affective state influences cognitive processing, and that animals in a negative affective state interpret ambiguous information pessimistically, expecting a negative outcome (displaying a negative cognitive bias). Most of these tests, however, require long discrimination training. The aim of this study was to validate an exploration-based cognitive bias test using two different handling methods, as previous studies have shown that standard tail handling of mice increases physiological and behavioural measures of anxiety compared with cupped handling. We therefore hypothesised that tail-handled mice would display a negative cognitive bias. We handled 28 female CD-1 mice for 16 weeks using either tail handling or cupped handling. The mice were then trained in an eight-arm radial maze, where two adjacent arms predicted a positive outcome (darkness and food), while the two opposite arms predicted a negative outcome (no food, white noise and light). After six days of training, the mice were also given access to the four previously unavailable intermediate ambiguous arms of the radial maze and tested for cognitive bias. We were unable to validate this test, as mice from both handling groups displayed a similar pattern of exploration. Furthermore, we examined whether maze exploration is affected by the expression of stereotypic behaviour in the home cage. Mice with higher levels of stereotypic behaviour spent more time in positive arms and avoided ambiguous arms, displaying a negative cognitive bias. While this test needs further validation, our results indicate that it may allow the assessment of affective state in mice with minimal training, a major confound in current cognitive bias paradigms.
Abstract:
A three-level satellite-to-ground monitoring scheme for conservation easement monitoring has been implemented in which high-resolution imagery serves as an intermediate step for inspecting high-priority sites. A digital vertical aerial camera system was developed to fulfill the need for an economical source of imagery for this intermediate step. A method for attaching the camera system to small aircraft was designed, and the camera system was calibrated and tested. To ensure that the images obtained were of suitable quality for use in Level 2 inspections, rectified imagery was required to provide positional accuracy of 5 meters or less to be comparable to current commercially available high-resolution satellite imagery. Focal length calibration was performed to discover the infinity focal length at two lens settings (24 mm and 35 mm) with a precision of 0.1 mm. A known focal length is required for the creation of navigation points representing locations to be photographed (waypoints). Photographing an object of known size at known distances on a test range allowed estimates of focal lengths of 25.1 mm and 35.4 mm for the 24 mm and 35 mm lens settings, respectively. Constants required for distortion removal procedures were obtained using analytical plumb-line calibration procedures for both lens settings, with mild distortion at the 24 mm setting and virtually no distortion found at the 35 mm setting. The system was designed to operate in a series of stages: mission planning, mission execution, and post-mission processing. During mission planning, waypoints were created using custom tools in geographic information system (GIS) software. During mission execution, the camera is connected to a laptop computer with a global positioning system (GPS) receiver attached. Customized mobile GIS software accepts position information from the GPS receiver, provides information for navigation, and automatically triggers the camera upon reaching the desired location.
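The test-range focal-length estimate follows from the pinhole camera model: an object of width W at distance D projects to width w on the sensor, giving f = w·D/W. A minimal sketch (the numbers in the example are hypothetical, not the thesis's measurements):

```python
def focal_length_mm(object_size_m, distance_m, image_size_mm):
    """Pinhole-model focal length estimate: f = w * D / W, where the
    object size W and distance D are in meters and the projected image
    size w is in millimeters. The meter units cancel, leaving mm."""
    return image_size_mm * distance_m / object_size_m

# Hypothetical example: a 2 m target at 100 m projecting to 0.5 mm
# on the sensor implies a 25 mm focal length.
```

Averaging such estimates over several distances, as done on a test range, reduces the effect of measurement error in any single shot.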
Post-mission processing (rectification) of imagery for removal of lens distortion effects, correction of imagery for horizontal displacement due to terrain variations (relief displacement), and relating the images to ground coordinates were performed with no more than a second-order polynomial warping function. Accuracy testing was performed to verify the positional accuracy capabilities of the system in an ideal-case scenario as well as a real-world case. Using many well-distributed and highly accurate control points on flat terrain, the rectified images yielded a median positional accuracy of 0.3 meters. Imagery captured over commercial forestland with varying terrain in eastern Maine, rectified to digital orthophoto quadrangles, yielded median positional accuracies of 2.3 meters, with accuracies of 3.1 meters or better in 75 percent of measurements made. These accuracies were well within performance requirements. The images from the digital camera system are of high quality, displaying significant detail at common flying heights. At common flying heights the ground resolution of the camera system ranges between 0.07 meters and 0.67 meters per pixel, satisfying the requirement that imagery be of comparable resolution to current high-resolution satellite imagery. Due to the high resolution of the imagery, the positional accuracy attainable, and the convenience with which it is operated, the digital aerial camera system developed is a potentially cost-effective solution for use in the intermediate step of a satellite-to-ground conservation easement monitoring scheme.
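A second-order polynomial warp of the kind used in rectification models each ground coordinate as a0 + a1·x + a2·y + a3·x² + a4·xy + a5·y² in image coordinates, with the six coefficients fit by least squares to ground control points. A minimal sketch (the control-point values below are synthetic, not the thesis's data):

```python
import numpy as np

def _design_matrix(img_xy):
    """Second-order polynomial terms [1, x, y, x^2, xy, y^2]."""
    x, y = np.asarray(img_xy, dtype=float).T
    return np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])

def fit_poly2(img_xy, ground_xy):
    """Least-squares fit of a second-order polynomial warp from image
    coordinates to ground coordinates using control points (needs at
    least six well-distributed points)."""
    A = _design_matrix(img_xy)
    coef, *_ = np.linalg.lstsq(A, np.asarray(ground_xy, dtype=float), rcond=None)
    return coef  # shape (6, 2): one coefficient column per ground axis

def apply_poly2(coef, img_xy):
    """Map image coordinates to ground coordinates with fitted coefficients."""
    return _design_matrix(img_xy) @ coef
```

In practice the residuals at withheld check points give exactly the kind of positional-accuracy figures (median error, 75th percentile) reported above.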
Abstract:
The goal of this study was to investigate the properties of human acid α-glucosidase with respect to: (i) the molecular heterogeneity of the enzyme and (ii) the synthesis, post-translational modification, and transport of acid α-glucosidase in human fibroblasts. The initial phase of these investigations involved the purification of acid α-glucosidase from human liver. Human hepatic acid α-glucosidase was characterized by isoelectric focusing and native and sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE). Four distinct charge forms of hepatic acid α-glucosidase were separated by chromatofocusing and characterized individually. Charge heterogeneity was demonstrated to result from differences in the polypeptide components of each charge form. The second aspect of this research focused on the biosynthesis and the intracellular processing and transport of acid α-glucosidase in human fibroblasts. These experiments were accomplished by immunoprecipitation of the biosynthetic intermediates of acid α-glucosidase from radioactively labeled fibroblasts with polyclonal and monoclonal antibodies raised against human hepatic acid α-glucosidase. The immunoprecipitated biosynthetic forms of acid α-glucosidase were analyzed by SDS-PAGE and autoradiography. The pulse-chase experiments demonstrated the existence of several transient, high-molecular-weight precursors of acid α-glucosidase. These precursors were demonstrated to be intermediates of acid α-glucosidase at different stages of transport and processing in the Golgi apparatus. Other experiments were performed to examine the role of co-translational glycosylation of acid α-glucosidase in the transport and processing of precursors of this enzyme. A specific immunological assay for detecting acid α-glucosidase was developed using the monoclonal antibodies described above. This method was modified to increase the sensitivity of the assay by utilization of the biotin-avidin amplification system. This method was demonstrated to be more sensitive for detecting human acid α-glucosidase than the currently used biochemical assay for acid α-glucosidase activity. It was also demonstrated that the biotin-avidin immunoassay could discriminate between normal and acid α-glucosidase-deficient fibroblasts, thus providing an alternative approach to detecting this inborn error of metabolism. (Abstract shortened with permission of author.)
Abstract:
While India's state-owned enterprises are widely believed to be inefficient, there is a dearth of studies that document such inefficiency on any rigorous basis. Yet, since improvement in firm efficiency is one of the basic objectives of privatization, it is important to assess whether efficiency is indeed lower in the public sector than in the private sector. This paper compares the performance of state-owned enterprises with that of private sector firms with respect to technical efficiency. The comparison is made in eight different sectors over the period 1991-92 to 1998-99. We measure technical efficiency using the method of Data Envelopment Analysis (DEA). Judging by average levels of technical efficiency, no conclusive evidence of superior performance on the part of the private sector is found.
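In the simplest single-input, single-output case, the constant-returns DEA (CCR) technical-efficiency score reduces to each firm's output/input ratio relative to the best observed ratio; the general model used in studies like this one instead solves one linear program per firm across multiple inputs and outputs. A minimal sketch of the single-input case only:

```python
def ccr_efficiency(inputs, outputs):
    """Single-input, single-output technical efficiency under constant
    returns to scale: each unit's output/input ratio divided by the best
    observed ratio, so scores lie in (0, 1] and the frontier unit scores 1.
    The multi-input/multi-output CCR model generalizes this via one
    linear program per decision-making unit."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

Comparing the average scores of the public and private subsamples, sector by sector, mirrors the comparison performed in the paper.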
Abstract:
Various airborne aldehydes and ketones (i.e., airborne carbonyls) present in outdoor, indoor, and personal air pose a risk to human health at present environmental concentrations. To date, there is no adequate, simple-to-use sampler for monitoring carbonyls at parts-per-billion concentrations in personal air. The Passive Aldehydes and Ketones Sampler (PAKS) originally developed for this purpose has been found to be unreliable in a number of relatively recent field studies. The PAKS method uses dansylhydrazine (DNSH) as the derivatization agent to produce aldehyde derivatives that are analyzed by HPLC with fluorescence detection. The reasons for the poor performance of the PAKS are not known, but it is hypothesized that the chemical derivatization conditions and reaction kinetics, combined with a relatively low sampling rate, may play a role. This study evaluated the effect of excitation and emission wavelengths, pH of the DNSH coating solution, extraction solvent, and time post-extraction on the yield and stability of the formaldehyde, acetaldehyde, and acrolein DNSH derivatives. The results suggest the following optimum conditions for the analysis of DNSH derivatives: the excitation and emission wavelengths for HPLC analysis should be 250 nm and 500 nm, respectively; the optimal pH of the coating solution appears to be pH 2, because it improves the formation of di-derivatized acrolein without affecting the response of the formaldehyde and acetaldehyde derivatives; acetonitrile is the preferable extraction solvent; and the optimal time to analyze the aldehyde derivatives is 72 hours post-extraction.
Abstract:
Purpose. Fluorophotometry is a well-validated method for assessing corneal permeability in human subjects. However, with the growing importance of basic-science animal research in ophthalmology, fluorophotometry's use in animals must be further evaluated. The purpose of this study was to evaluate corneal epithelial permeability following desiccating stress using the modified Fluorotron Master™.

Methods. Corneal permeability was evaluated prior to and after subjecting 6-8 week old C57BL/6 mice to experimental dry eye (EDE) for 2 and 5 days (n=9/time point). Untreated mice served as controls. Ten microliters of 0.001% sodium fluorescein (NaF) were instilled topically into each mouse's left eye to create an eye bath and left to permeate for 3 minutes. The eye bath was followed by a generous wash with buffered saline solution (BSS) and alignment with the Fluorotron Master™. Seven corneal scans were performed with the Fluorotron Master during 15 minutes (1st post-wash scans), followed by a second wash using BSS and another set of five corneal scans (2nd post-wash scans) during the next 15 minutes. Corneal permeability was calculated using data from the FM™ Mouse software.

Results. Using a repeated-measures design, corneal fluorescein permeability in the post-wash #1 scans differed significantly from untreated mice after 5 days of EDE (1160.21±108.26 vs. 1000.47±75.56 ng/mL, P<0.016 [0.008]) but not after 2 days (1115.64±118.94 vs. 1000.47±75.56 ng/mL, P>0.016 [0.050]). There was no statistical difference between the 2-day and 5-day post-wash #1 scans (P=0.299). The post-wash #2 scans demonstrated that EDE caused significant NaF retention at both 2 and 5 days compared to baseline, untreated controls (1017.92±116.25 and 1015.40±120.68 vs. 528.22±127.85 ng/mL, P<0.05 [0.0001 for both]). There was no statistical difference between the 2-day and 5-day post-wash #2 scans (P=0.503). A paired t-test comparing the untreated post-wash #1 and post-wash #2 scans showed a significant difference between the two sets of scans (P<0.001), as did the corresponding 2-day and 5-day comparisons (P=0.010 and P=0.002, respectively).

Conclusion. Desiccating stress increases permeability of the corneal epithelium to NaF and increases NaF retention in the corneal stroma. The Fluorotron Master is a useful and sensitive tool to evaluate corneal permeability in murine dry eye, and will be useful for evaluating the effectiveness of dry-eye treatments in animal-model drug trials.
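The 0.016 significance threshold used above is consistent with a Bonferroni correction of α = 0.05 across three pairwise comparisons (0.05/3 ≈ 0.0167), though the abstract does not spell the correction out. A minimal sketch of that adjustment:

```python
def bonferroni_threshold(alpha, n_comparisons):
    """Per-comparison significance threshold under Bonferroni correction:
    the family-wise alpha is split evenly across the comparisons."""
    return alpha / n_comparisons

def significant(p_values, alpha=0.05):
    """Flag each p-value against the Bonferroni-adjusted threshold for
    the whole family of comparisons."""
    t = bonferroni_threshold(alpha, len(p_values))
    return [p < t for p in p_values]
```

Applied to the three post-wash #1 p-values reported above (0.008, 0.050, 0.299), only the 5-day comparison clears the adjusted threshold, matching the stated result.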
Abstract:
The Centers for Disease Control estimates that foodborne diseases cause approximately 76 million illnesses, 325,000 hospitalizations, and 5,000 deaths in the United States each year. The American public is becoming more health conscious, and there has been an increase in the dietary intake of fresh fruits and vegetables. Affluence and demand for convenience have allowed consumers to opt for pre-processed packaged fresh fruits and vegetables. These pre-processed foods are considered Ready-to-Eat. They have many of the advantages of fresh produce without the inconvenience of processing at home. After a decline in food-related illnesses between 1996 and 2004, due to an improvement in meat and poultry safety, tainted produce has tilted the numbers back. This has resulted in none of the Healthy People 2010 targets for food-related illness reduction being reached. Irradiation has been shown to be effective in eliminating many foodborne pathogens. The application of irradiation as a food safety treatment has been widely endorsed by many of the major associations involved with food safety and public health. Despite these endorsements, there has been very little use of this technology to date for reducing the disease burden associated with the consumption of these products. A review of the literature available since the passage of the 1996 Food Quality Protection Act was conducted on the barriers to implementing irradiation as a food safety process for fresh fruits and vegetables. The impediments to widespread adoption of irradiation food processing as a food safety measure involve a complex array of legislative, regulatory, industry, and consumer issues. The FDA's approval process limits the expansion of the list of foods approved for the application of irradiation as a food safety process. There is also a lack of capacity within the industry to meet the needs of a geographically dispersed industry.
Abstract:
Clinical text understanding (CTU) is of interest to health informatics because critical clinical information, frequently represented as unconstrained text in electronic health records, is extensively used by human experts to guide clinical practice and decision making and to document delivery of care, but is largely unusable by information systems for queries and computations. Recent initiatives advocating for translational research call for the generation of technologies that can integrate structured clinical data with unstructured data, provide a unified interface to all data, and contextualize clinical information for reuse in the multidisciplinary and collaborative environment envisioned by the CTSA program. This implies that technologies for the processing and interpretation of clinical text should be evaluated not only in terms of their validity and reliability in their intended environment, but also in light of their interoperability and their ability to support information integration and contextualization in a distributed and dynamic environment. This vision adds a new layer of information representation requirements that needs to be accounted for when conceptualizing implementation or acquisition of clinical text processing tools and technologies for multidisciplinary research. On the other hand, electronic health records frequently contain unconstrained clinical text with high variability in the use of terms and documentation practices, and without commitment to the grammatical or syntactic structure of the language (e.g., triage notes, physician and nurse notes, chief complaints). This hinders the performance of natural language processing technologies, which typically rely heavily on the syntax of language and the grammatical structure of the text.
This document introduces our method to transform unconstrained clinical text found in electronic health information systems into a formal (computationally understandable) representation that is suitable for querying, integration, contextualization and reuse, and is resilient to the grammatical and syntactic irregularities of clinical text. We present our design rationale, method, and results of evaluation in processing chief complaints and triage notes from 8 different emergency departments in Houston, Texas. Finally, we discuss the significance of our contribution in enabling the use of clinical text in a practical bio-surveillance setting.
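As a toy illustration of the kind of irregularity involved (not the authors' actual method or resources), chief-complaint abbreviations can be normalized against a synonym table after basic cleanup; all table entries below are hypothetical:

```python
import re

# Toy synonym table: hypothetical entries for illustration only,
# not the authors' actual terminology resource.
SYNONYMS = {
    "sob": "shortness of breath",
    "cp": "chest pain",
    "chest px": "chest pain",
    "n/v": "nausea and vomiting",
}

def normalize_complaint(text):
    """Map a free-text chief complaint to a canonical concept string:
    lowercase, collapse whitespace, then apply the synonym table;
    unknown strings pass through cleaned but unmapped."""
    cleaned = re.sub(r"\s+", " ", text.strip().lower())
    return SYNONYMS.get(cleaned, cleaned)
```

Real systems map to formal concept identifiers (e.g. in a standard terminology) rather than strings, and must cope with misspellings and compound complaints; this sketch only shows why grammar-dependent NLP struggles on such input.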
Abstract:
This chapter attempts to identify whether product differentiation or geographical differentiation is the main source of profit for firms in developing economies, employing a simple idea from recently developed methods of empirical industrial organization. Theoretically, location choice and product choice have been considered analogues in differentiation, but in the real world, which of these strategies is chosen makes an immense difference to firm behavior and to the development process of the industry. Advances in the techniques of empirical industrial organization enable us to identify market outcomes with endogeneity; a typical case is the market outcome with differentiation, where price or product choice is endogenously determined. Our original survey contains data on market location, differences in product types, and price. The results show that product differentiation, rather than geographical differentiation, mitigates the pressure of price competition, although 70 per cent of firms secure a geographical monopoly.
Abstract:
Advanced liver surgery requires precise pre-operative planning, where liver segmentation and remnant liver volume are key elements to avoid post-operative liver failure. In that context, level-set algorithms have achieved better results than other approaches, especially with altered liver parenchyma or in cases with previous surgery. In order to improve functional liver parenchyma volume measurements, in this work we propose two strategies to enhance previous level-set algorithms: an optimal multi-resolution strategy with fine-detail correction and adaptive curvature, as well as an additional semiautomatic step imposing local curvature constraints. Results show more accurate segmentations, especially in elongated structures, detecting internal lesions and avoiding leakage into neighboring structures.
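Curvature constraints in level-set methods act on the mean curvature of the level-set function, κ = div(∇φ/|∇φ|), evaluated on the image grid. A minimal finite-difference sketch of that quantity (only the curvature term, not the authors' full segmentation algorithm):

```python
import numpy as np

def level_set_curvature(phi, h):
    """Curvature kappa = div(grad(phi) / |grad(phi)|) of a 2-D level-set
    function sampled on a regular grid with spacing h, using central
    differences via np.gradient."""
    gy, gx = np.gradient(phi, h)            # axis 0 = y, axis 1 = x
    norm = np.sqrt(gx**2 + gy**2) + 1e-12   # avoid division by zero
    nx, ny = gx / norm, gy / norm           # unit normal field
    return np.gradient(nx, h, axis=1) + np.gradient(ny, h, axis=0)

# Sanity check: for the signed distance to a circle, the level set through
# a point at distance rho from the centre has curvature 1/rho.
```

In a level-set evolution this term is weighted into the speed function; making the weight spatially adaptive (or locally constrained, as proposed above) controls smoothing in elongated structures.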
Abstract:
During sentence processing there is a preference to treat the first noun phrase found as the subject and agent, unless it is marked otherwise. This preference leads to a conflict in thematic role assignment when the syntactic structure conforms to a non-canonical object-before-subject pattern. Left perisylvian and fronto-parietal brain networks have been found to be engaged by increased computational demands during sentence comprehension, while event-related brain potentials have been used to study the on-line manifestation of these demands. However, evidence regarding the spatiotemporal organization of brain networks in this domain is scarce. In the current study we used magnetoencephalography (MEG) to track brain activity spatio-temporally while Spanish speakers were reading subject- and object-first cleft sentences. Both kinds of sentences remained ambiguous between a subject-first and an object-first interpretation up to the appearance of the second argument. Results show the time-modulation of a frontal network at the disambiguation point of object-first sentences. Moreover, the time windows where these effects took place have previously been related to thematic role integration (300-500 ms) and to sentence reanalysis and resolution of conflicts during processing (beyond 500 ms post-stimulus). These results point to frontal cognitive control as a putative key mechanism that may operate when a revision of the sentence structure and meaning is necessary.
Abstract:
Linear regression is a technique widely used in digital signal processing. It consists of finding the linear function that best fits a given set of samples. This paper proposes different hardware architectures for the implementation of the linear regression method on FPGAs, especially targeting area-restricted systems. It saves area at the cost of constraining the lengths of the input signal to a set of fixed values. We have implemented the proposed scheme in an automatic modulation classifier, meeting the hard real-time constraints this kind of system has.
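The area saving from fixing the input length can be illustrated in software: with sample positions fixed at x = 0..N-1, the sums Σx and Σx² and the least-squares denominator N·Σx² - (Σx)² depend only on N and can be precomputed as constants, leaving only the y-dependent sums to evaluate per input. A minimal sketch (a software analogue of the idea, not the paper's FPGA architecture):

```python
def make_fixed_length_regressor(n):
    """Least-squares line fit y = slope * x + intercept for inputs of a
    fixed length n, with sample positions x = 0..n-1. The x-dependent
    sums and the denominator are precomputed once, mirroring the
    hardware trick of constraining input lengths to fixed values."""
    sx = n * (n - 1) // 2                  # sum of x, closed form
    sxx = (n - 1) * n * (2 * n - 1) // 6   # sum of x^2, closed form
    denom = n * sxx - sx * sx              # fixed once n is fixed

    def fit(y):
        sy = sum(y)
        sxy = sum(i * yi for i, yi in enumerate(y))
        slope = (n * sxy - sx * sy) / denom
        intercept = (sy - slope * sx) / n
        return slope, intercept

    return fit
```

In hardware, fixing `denom` turns a runtime division by a variable into multiplication by a stored constant (or a shift for suitable N), which is where the area saving comes from.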