958 results for Acquired Flatfoot


Relevance: 10.00%

Abstract:

Marine Fishery Reserves (MFRs) are being adopted, in part, as a strategy to replenish depleted fish stocks and to serve as a source of recruits for adjacent fisheries. By necessity, their design must consider the biological parameters of the species under consideration to ensure that the spawning stock is conserved while simultaneously providing propagules for dispersal. We describe how acoustic telemetry can be employed to design effective MFRs by elucidating important life-history parameters, such as home range, and ecological preferences, such as habitat utilization. We then designed a reserve based on these parameters using data from two acoustic telemetry studies that examined two closely linked subpopulations of queen conch (Strombus gigas) at Conch Reef in the Florida Keys. The union of the home ranges of the individual conch (aggregation home range: AgHR) within each subpopulation was used to construct a shape delineating the area within which a conch would be located with high probability. Together with habitat utilization information acquired during both the spawning and non-spawning seasons, as well as landscape features (i.e., corridors), we designed a 66.5 ha MFR to conserve the conch population. Consideration was also given to further expansion of the population into suitable habitats.
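The aggregation home range (AgHR) described above is the union of individual home ranges. A minimal, stdlib-only sketch of that idea follows; the telemetry fixes are hypothetical, and each home range is reduced to the bounding box of its fixes with union area counted on a 1 m grid (real analyses would use kernel or minimum-convex-polygon estimators over actual acoustic-telemetry positions):

```python
# Sketch: AgHR as the union of individual home ranges.
# Hypothetical telemetry fixes (x, y in metres) for three conch.
fixes = {
    "conch_A": [(0, 0), (40, 10), (20, 50)],
    "conch_B": [(30, 40), (80, 45), (60, 90)],
    "conch_C": [(70, 0), (120, 20), (95, 55)],
}

def bbox(points):
    """Crude home-range stand-in: bounding box of the fixes."""
    xs, ys = zip(*points)
    return min(xs), min(ys), max(xs), max(ys)

boxes = [bbox(p) for p in fixes.values()]

# AgHR area: count 1 m grid cells covered by at least one home range,
# so overlapping ranges are not double-counted.
covered = {
    (x, y)
    for x0, y0, x1, y1 in boxes
    for x in range(int(x0), int(x1))
    for y in range(int(y0), int(y1))
}
aghr_area = len(covered)  # m^2
print(f"AgHR area: {aghr_area} m^2")
```

The grid-cell union generalizes directly to irregular home-range shapes, which is why it is a common first pass before polygon-based estimators.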

Relevance: 10.00%

Abstract:

As part of a multibeam and side scan sonar (SSS) benthic survey of the Marine Conservation District (MCD) south of St. Thomas, USVI and the seasonal closed areas in St. Croix—Lang Bank (LB) for red hind (Epinephelus guttatus) and the Mutton Snapper (MS) (Lutjanus analis) area—we extracted signals from water column targets that represent individual and aggregated fish over the various benthic habitats encountered in the SSS imagery. The survey covered a total of 18 km2 throughout the federal jurisdiction fishery management areas. The complementary set of 28 habitat classification digital maps covered a total of 5,462.3 ha; MCDW (West) accounted for 45% of that area, MCDE (East) 26%, LB 17%, and MS the remaining 13%. With the exception of MS, corals and gorgonians on consolidated habitats were significantly more abundant than submerged aquatic vegetation (SAV) on unconsolidated sediments or unconsolidated sediments alone. Continuous coral habitat was the most abundant consolidated habitat in both MCDW and MCDE (41% and 43%, respectively). Consolidated habitats in LB and MS consisted predominantly of gorgonian plain habitat (95% and 83%, respectively). Coral limestone habitat was more abundant than coral patch habitat and was found near the shelf break in MS, MCDW, and MCDE; coral limestone and coral patch habitats covered only a minimal portion of LB. The high spatial resolution (0.15 m) of the acquired imagery allowed the detection of differing fish aggregation (FA) types. The largest FA densities were located at MCDW and MCDE over coral communities that occupy up to 70% of the bottom cover. Counts of unidentified swimming objects (USOs), likely representing individual fish, were similar among locations and occurred primarily over sand and shelf-edge areas. Fish aggregation school sizes were significantly smaller at MS than at the other three locations (MCDW, MCDE, and LB). This study shows the advantages of utilizing SSS in determining fish distributions and densities.

Relevance: 10.00%

Abstract:

Background: Polymerase Chain Reaction (PCR) and Restriction Fragment Length Polymorphism analysis of PCR products (PCR-RFLP) are extensively used molecular biology techniques. An exercise for the design and simulation of PCR and PCR-RFLP experiments would be a useful educational tool. Findings: An online PCR and PCR-RFLP exercise has been created that requires users to find the target genes, compare them, design primers, search for restriction endonucleases, and finally simulate the experiment. Each user of the service is randomly assigned a gene from Escherichia coli; to complete the exercise, users must design an experiment capable of distinguishing among E. coli strains. By applying the experimental procedure to all completely sequenced E. coli strains, a basic understanding of strain comparison and clustering can also be acquired. Comparison of results obtained in different experiments is also very instructive. Conclusions: The exercise is freely available at http://insilico.ehu.es/edu.
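The simulation the exercise asks for can be sketched in a few lines: locate the primer pair on a template, take the amplicon between them, and cut it with a restriction enzyme to predict RFLP fragment sizes. This is an illustration, not the site's actual code; the template sequence and primers are invented, and the recognition site GAATTC is EcoRI's:

```python
# Toy PCR + restriction-digest simulation (hypothetical sequences).
def pcr(template, fwd, rev_comp):
    """Return the amplicon delimited by the forward primer and the
    (already reverse-complemented) reverse primer binding site."""
    start = template.find(fwd)
    end = template.find(rev_comp, start) + len(rev_comp)
    return template[start:end]

def digest(amplicon, site):
    """Cut at every occurrence of the recognition site; return fragment lengths."""
    pieces = amplicon.split(site)
    # reattach the split-out site bases so fragment lengths add up
    frags = [p + site for p in pieces[:-1]] + [pieces[-1]]
    return [len(f) for f in frags]

template = "ATGCGTGAATTCAAAGGATCCTTTGAATTCGGGTACCCATG"
amplicon = pcr(template, "ATGCGT", "ACCCATG")
print(digest(amplicon, "GAATTC"))  # -> [12, 18, 11]
```

Two strains whose target genes differ at a restriction site would yield different fragment-length lists, which is exactly the discrimination the exercise has users design for.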

Relevance: 10.00%

Abstract:

To improve the cod stocks in the Baltic Sea, a number of regulations have recently been established by the International Baltic Sea Fisheries Commission (IBSFC) and the European Commission. According to these, fishermen are obliged to use nets with escape windows (BACOMA nets), with a mesh size of the escape window of 120 mm, until the end of September 2003. These nets, however, retain only fish much larger than the legal minimum landing size would allow, and due to the present stock structure only few such large fish exist. As a consequence, fishermen use a legal alternative net: a conventional trawl with a cod-end of 130 mm diamond-shaped meshes (IBSFC rules of 1st April 2002), to be increased to 140 mm on 1st September 2003 according to the same IBSFC rule. Due to legal alterations of the net by the fishermen (e.g., the use of extra-stiff net material), these nets have acquired extremely low selective properties, i.e., they catch very small fish and produce great amounts of discards. With the increase of the minimum landing size for Baltic cod from 35 to 38 cm, the amount of discards has increased further since the beginning of 2003. Experiments have now been carried out with the BACOMA net on German and Swedish commercial and research vessels, since arguments were brought forward that the BACOMA net had not yet been sufficiently tested on commercial vessels. The results of all experiments conducted so far are compiled and evaluated here. As a result of the Swedish, Danish and German initiative and research, the European Commission reacted in June 2003 and rejected the increase of the diamond-meshed non-BACOMA cod-end from 130 mm to 140 mm in September 2003. To protect the cod stocks in the Baltic Sea more effectively, the use of traditional diamond-meshed cod-ends without an escape window is prohibited in community waters without derogation, effective 1st September 2003.
To enable more effective and simplified control of the bottom-trawl fishery in the Baltic Sea, the principle of a “One-Net-Rule” is enforced. This is going to be the BACOMA net, with the meshes of the escape window being 110 mm for the time being. The description of the BACOMA net as given in IBSFC Rule No. 10 (revision of the 28th session, Berlin 2002) concentrates on the cod-end and the escape window, but only to a lesser extent on the design and mesh composition of the remaining parts of the net, such as the belly and funnel, and on many details. Thus, the present description is not complete and leaves, according to fishermen, ample opportunity for manipulation. An initiative has been started in Germany, with joint effort from scientists and the fishery, to better describe the entire net and to produce a proposal for a more comprehensive description leaving less room for manipulation. Such a proposal is given here; it should be seen as a starting point for discussion and development towards an internationally uniform net agreed upon by the fishery, scientists and politicians. The Baltic Sea fishery is invited to comment on this proposal, and recommendations for further improvement and specification are welcome. Once the design is agreed by the Baltic Fishermen Association, it shall be proposed by that association to the IBSFC and the European Commission.

Relevance: 10.00%

Abstract:

Daily sea surface temperatures have been acquired at the Hopkins Marine Station in Pacific Grove, California since January 20, 1919. This time series is one of the longest oceanographic records along the U.S. west coast. Because of its length it is well suited for studying climate-related and oceanic variability on interannual, decadal, and interdecadal time scales. The record, however, is not homogeneous, has numerous gaps, contains possible outliers, and the observations were not always collected at the same time each day. Because of these problems we have undertaken the task of reconstructing this long and unique series. We describe the steps that were taken and the methods that were used in this reconstruction. Although the methods employed are basic, we believe that they are consistent with the quality of the data. The reconstructed record has values, original or estimated, at every time point, and has been adjusted for time-of-day variations where this information was available. Possible outliers have also been examined and replaced where their credibility could not be established. Many of the studies that have employed the Hopkins time series have not discussed the issue of data quality and how these problems were addressed. Because of growing interest in this record, it is important that a single, well-documented version be adopted so that the results of future analyses can be directly compared. Although additional work may be done to further improve the quality of this record, it is now available via the internet. [PDF contains 48 pages]
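Two of the reconstruction steps described above, filling short gaps and screening outliers, can be sketched with basic methods consistent with the text. The series and thresholds below are synthetic illustrations, not the procedures actually applied to the Hopkins record:

```python
# Minimal gap-filling and outlier-screening sketch for a daily SST series.
def fill_gaps(series):
    """Linearly interpolate interior None values (assumes the series
    starts and ends with observed values)."""
    out = list(series)
    for i, v in enumerate(out):
        if v is None:
            # next observed value after the gap
            j = next(k for k in range(i + 1, len(out)) if out[k] is not None)
            step = (out[j] - out[i - 1]) / (j - i + 1)
            out[i] = out[i - 1] + step
    return out

def flag_outliers(series, lo=5.0, hi=25.0):
    """Return indices outside a plausible SST range (deg C); the
    bounds here are illustrative, not the record's actual limits."""
    return [i for i, v in enumerate(series) if not lo <= v <= hi]

sst = [12.1, 12.3, None, None, 12.9, 40.0, 13.2]  # deg C, daily; synthetic
filled = fill_gaps(sst)
print(filled)
print(flag_outliers(filled))  # the 40.0 reading is flagged
```

A flagged value would then be examined and, where its credibility could not be established, replaced by an estimate such as the interpolation above.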

Relevance: 10.00%

Abstract:

2nd International Conference on Education and New Learning Technologies

Relevance: 10.00%

Abstract:

A new set of experimental data on subcooled pool boiling on a thin wire in microgravity aboard the 22nd Chinese recoverable satellite is reported in the present paper. The temperature-controlled heating method is used. The results of the experiments in normal gravity before and after the flight experiment are also presented and compared with those in microgravity. The working fluid is degassed R113 at 0.1 MPa, nominally subcooled by 26 °C. A thin platinum wire, 60 μm in diameter and 30 mm in length, is used simultaneously as heater and thermometer. It is found that the heat transfer of nucleate pool boiling is slightly enhanced in microgravity compared with that in normal gravity. It is also found that the correlation of Lienhard and Dhir predicts the CHF with good agreement, although the range of the dimensionless radius is extended by three or more decades beyond the originally set limit. Three critical bubble diameters are observed in microgravity, which divide the observed vapor bubbles into four size regimes. Considering the Marangoni effect, a qualitative model is proposed to reveal the mechanism underlying the bubble departure processes, and quantitative agreement can also be acquired.
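For orientation, one common form of the Lienhard–Dhir (Sun–Lienhard) CHF correlation for horizontal cylinders multiplies the Zuber flat-plate CHF by a factor depending on the dimensionless radius R'. The sketch below evaluates it for a wire of the stated size; the R113 property values are approximate, the correlation strictly applies to saturated boiling, and this is an illustration rather than a reproduction of the paper's calculation:

```python
# Hedged sketch: q_chf = q_zuber * (0.89 + 2.27*exp(-3.44*sqrt(R'))),
# with R' = R * sqrt(g * (rho_l - rho_v) / sigma).
import math

g = 9.81                      # m/s^2
rho_l, rho_v = 1500.0, 7.4    # kg/m^3, saturated R113 near 0.1 MPa (approx.)
h_fg = 144e3                  # J/kg (approx.)
sigma = 0.015                 # N/m (approx.)
R = 30e-6                     # wire radius: 60-micron diameter

# Dimensionless radius; for this thin wire it falls far below the
# correlation's original validity limit, as the abstract notes.
R_prime = R * math.sqrt(g * (rho_l - rho_v) / sigma)

# Zuber flat-plate CHF
q_zuber = 0.131 * math.sqrt(rho_v) * h_fg * (sigma * g * (rho_l - rho_v)) ** 0.25

q_chf = q_zuber * (0.89 + 2.27 * math.exp(-3.44 * math.sqrt(R_prime)))
print(f"R' = {R_prime:.3f}, predicted CHF ~ {q_chf / 1e3:.0f} kW/m^2")
```

Note that subcooling raises the measured CHF above this saturated-boiling estimate, which is part of why the reported agreement over such a small R' is notable.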

Relevance: 10.00%

Abstract:

Interleukin-2 (IL-2) is one of the lymphokines secreted by T helper type 1 cells upon activation mediated by the T-cell receptor (TCR) and accessory molecules. The ability to express IL-2 is correlated with T-lineage commitment and is regulated during T-cell development and differentiation. To understand how IL-2 gene inducibility is controlled at each transition and differentiation step of T-cell development is to understand one aspect of T-cell development itself. In the present study, we first attempted to elucidate the molecular basis for the developmental changes in IL-2 gene inducibility. We showed that IL-2 gene inducibility is acquired early, in immature CD4-CD8-TCR- thymocytes prior to TCR gene rearrangement. As in mature T cells, a complete set of transcription factors can be induced at this early stage to activate IL-2 gene expression. The progression of these cells to cortical CD4+CD8+TCRlo cells is accompanied by the loss of IL-2 gene inducibility. We demonstrated that the DNA-binding activities of two transcription factors, AP-1 and NF-AT, are reduced in cells at this stage. Further, the loss of factor binding, especially AP-1, is attributable to a reduced ability to activate expression of three potential components of AP-1 and NF-AT: c-Fos, FosB, and Fra-2. We next examined the interaction of transcription factors with the IL-2 promoter in vivo using the EL4 T-cell line and two non-T-cell lines. We observed an all-or-none phenomenon in the factor-DNA interaction: in activated T cells, the IL-2 promoter is occupied by sequence-specific transcription factors when all of the factors are available; in resting T cells or non-T cells, no specific protein-DNA interaction is observed when only a subset of the factors is present in the nuclei. 
Purposefully reducing a particular set of factor-binding activities in stimulated T cells using the pharmacological agents cyclosporin A or forskolin also abolished all interactions. The results suggest that a combinatorial and coordinated protein-DNA interaction is required for IL-2 gene activation. The thymocyte experiments clearly illustrate that multiple transcription factors are regulated during intrathymic T-cell development, and that this regulation in turn controls the inducibility of the lineage-specific IL-2 gene. The in vivo study of protein-DNA interactions stressed the combinatorial action of transcription factors in stably occupying the IL-2 promoter and initiating its transcription, and provided a molecular mechanism for changes in IL-2 gene inducibility in T cells undergoing integration of multiple environmental signals.
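The all-or-none occupancy described above amounts to a conjunction: the promoter is stably bound only when every required factor is available, and removing any one (as cyclosporin A does for NF-AT) leaves it fully unoccupied. A deliberately oversimplified toy model; the factor list is illustrative, not the study's complete set:

```python
# Toy all-or-none model of IL-2 promoter occupancy: binding is the
# logical AND of the required factors' availability.
REQUIRED = {"AP-1", "NF-AT", "NF-kB", "Oct-1"}

def promoter_occupied(available_factors):
    """Occupied (and transcription initiated) only if every required
    factor is present; any subset short of the full set gives nothing."""
    return REQUIRED <= set(available_factors)

print(promoter_occupied({"AP-1", "NF-AT", "NF-kB", "Oct-1"}))  # True
print(promoter_occupied({"AP-1", "NF-kB", "Oct-1"}))           # False: NF-AT missing
```

The point of the caricature is that occupancy is not graded with the number of factors present, matching the observation that a partial factor set yields no detectable protein-DNA interaction at all.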

Relevance: 10.00%

Abstract:

Humans are capable of distinguishing more than 5,000 visual categories even in complex environments, using a variety of different visual systems all working in tandem, and we seem to be able to distinguish thousands of different odors as well. In the machine learning community, many commonly used multi-class classifiers do not scale well to such large numbers of categories. This thesis demonstrates a method of automatically creating application-specific taxonomies to aid in scaling classification algorithms to more than 100 categories using both visual and olfactory data. The visual data consist of images collected online and pollen slides scanned under a microscope. The olfactory data were acquired by constructing a small portable sniffing apparatus which draws air over 10 carbon black polymer composite sensors. We investigate performance when classifying 256 visual categories, 8 or more species of pollen, and 130 olfactory categories sampled from common household items and a standardized scratch-and-sniff test. Taxonomies are employed in a divide-and-conquer classification framework which improves classification time while allowing the end user to trade performance for specificity as needed. Before classification can even take place, the pollen counter and electronic nose must filter out a high volume of background “clutter” to detect the categories of interest. In the case of pollen, this is done with an efficient cascade of classifiers that rules out most non-pollen before invoking slower multi-class classifiers. In the case of the electronic nose, much of the extraneous noise encountered in outdoor environments can be filtered using a sniffing strategy which preferentially samples the sensor response at frequencies that are relatively immune to background contributions from ambient water vapor. 
This combination of efficient background rejection with scalable classification algorithms is tested in detail for three separate projects: 1) the Caltech-256 Image Dataset, 2) the Caltech Automated Pollen Identification and Counting System (CAPICS) and 3) a portable electronic nose specially constructed for outdoor use.
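The divide-and-conquer framework can be sketched as a two-level decision: a coarse classifier first picks a taxonomy branch, then a per-branch fine classifier picks the leaf category, so each decision involves far fewer classes than a flat multi-class model. The classifiers below are stand-in nearest-centroid models on hypothetical 2-D features; the thesis's actual models and taxonomy construction differ:

```python
# Sketch of taxonomy-based divide-and-conquer classification.
import math

def nearest(label_to_centroid, x):
    """Nearest-centroid decision: pick the label whose centroid is closest."""
    return min(label_to_centroid,
               key=lambda lab: math.dist(label_to_centroid[lab], x))

# Hypothetical taxonomy: two coarse branches, each with its own leaves.
coarse = {"animals": (0.0, 0.0), "vehicles": (10.0, 10.0)}
fine = {
    "animals": {"cat": (-1.0, 0.0), "dog": (1.0, 0.5)},
    "vehicles": {"car": (9.0, 10.0), "plane": (11.0, 11.0)},
}

def classify(x):
    branch = nearest(coarse, x)       # cost grows with #branches...
    return nearest(fine[branch], x)   # ...plus #leaves in one branch only

print(classify((0.8, 0.4)))    # -> dog
print(classify((11.2, 10.8)))  # -> plane
```

With B branches of L leaves each, the two-stage decision costs O(B + L) comparisons instead of O(B * L) for a flat classifier, which is the scaling advantage the taxonomy buys; stopping at the branch level is the performance-for-specificity trade mentioned above.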

Relevance: 10.00%

Abstract:

Waking up from a dreamless sleep, I open my eyes, recognize my wife’s face and am filled with joy. In this thesis, I used functional Magnetic Resonance Imaging (fMRI) to gain insights into the mechanisms involved in this seemingly simple daily occurrence, which poses at least three great challenges to neuroscience: how does conscious experience arise from the activity of the brain? How does the brain process visual input to the point of recognizing individual faces? How does the brain store semantic knowledge about people that we know? To start tackling the first question, I studied the neural correlates of unconscious processing of invisible faces. I was unable to image significant activations related to the processing of completely invisible faces, despite existing reports in the literature. I thus moved on to the next question and studied how recognition of a familiar person was achieved in the brain; I focused on finding invariant representations of person identity – representations that would be activated any time we think of a familiar person, read their name, see their picture, hear them talk, etc. There again, I could not find significant evidence for such representations with fMRI, even in regions where they had previously been found with single unit recordings in human patients (the Jennifer Aniston neurons). Faced with these null outcomes, the scope of my investigations eventually turned back towards the technique that I had been using, fMRI, and the recently praised analytical tools that I had been trusting, Multivariate Pattern Analysis. After a mostly disappointing attempt at replicating a strong single unit finding of a categorical response to animals in the right human amygdala with fMRI, I put fMRI decoding to an ultimate test with a unique dataset acquired in the macaque monkey. 
There I showed a dissociation between the ability of fMRI to pick up face-viewpoint information and its inability to pick up face-identity information, which I mostly traced back to the poor clustering of identity-selective units. Though fMRI decoding is a powerful new analytical tool, it does not rid fMRI of its inherent limitations as a hemodynamics-based measure.
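The Multivariate Pattern Analysis referred to above can be sketched in minimal form: estimate a mean voxel pattern per condition from training data, then decode each test pattern by assigning it to the condition whose training pattern it correlates with best. The data below are synthetic and the pipeline omits preprocessing, cross-validation folds, and permutation testing that a real analysis would include:

```python
# Minimal correlation-based MVPA decoding sketch on synthetic voxel data.
import numpy as np

rng = np.random.default_rng(0)
n_vox = 50
# Hypothetical "true" voxel patterns for two conditions
templates = {"faceA": rng.normal(size=n_vox), "faceB": rng.normal(size=n_vox)}

def simulate(cond, noise=1.0):
    """One noisy single-trial pattern for a condition."""
    return templates[cond] + rng.normal(scale=noise, size=n_vox)

# Training: per-condition mean pattern over 8 simulated trials
train = {c: np.mean([simulate(c) for _ in range(8)], axis=0) for c in templates}

def decode(pattern):
    """Assign the test pattern to the best-correlating training pattern."""
    return max(train, key=lambda c: np.corrcoef(pattern, train[c])[0, 1])

correct = sum(decode(simulate(c)) == c for c in templates for _ in range(20))
print(f"decoding accuracy: {correct / 40:.2f}")
```

The dissociation described above corresponds to this decoder succeeding when condition information is spatially clustered at the voxel scale (viewpoint) and failing when the informative units are intermixed within voxels (identity).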

Relevance: 10.00%

Abstract:

Commercially available software packages for IBM PC compatibles are evaluated for use in data acquisition and processing work. Moss Landing Marine Laboratories (MLML) has acquired computers since 1978 for shipboard data acquisition (i.e., CTD, radiometric, etc.) and data processing. Hewlett-Packard desktops were used first, followed by a transition to DEC VAXstations, with software developed mostly by the author and others at MLML (Broenkow and Reaves, 1993; Feinholz and Broenkow, 1993; Broenkow et al., 1993). IBM PCs were at first very slow and limited in available software, so they were not used in the early days. Improved technology, such as higher-speed microprocessors and a wide range of commercially available software, makes use of the PC more reasonable today. MLML is making a transition towards using the PC for data acquisition and processing. Its advantages are portability and the availability of outside support.

Relevance: 10.00%

Abstract:

Recent observations of the temperature anisotropies of the cosmic microwave background (CMB) favor an inflationary paradigm in which the scale factor of the universe inflated by many orders of magnitude at some very early time. Such a scenario would produce the observed large-scale isotropy and homogeneity of the universe, as well as the scale-invariant perturbations responsible for the observed (10 parts per million) anisotropies in the CMB. An inflationary epoch is also theorized to produce a background of gravitational waves (or tensor perturbations), the effects of which can be observed in the polarization of the CMB. The E-mode (or parity-even) polarization of the CMB, which is produced by scalar perturbations, has now been measured with high significance. In contrast, the B-mode (or parity-odd) polarization, which is sourced by tensor perturbations, has yet to be observed. A detection of the B-mode polarization of the CMB would provide strong evidence for an inflationary epoch early in the universe’s history.

In this work, we explore experimental techniques and analysis methods used to probe the B-mode polarization of the CMB. These experimental techniques have been used to build the Bicep2 telescope, which was deployed to the South Pole in 2009. After three years of observations, Bicep2 has acquired one of the deepest observations of the degree-scale polarization of the CMB to date. Similarly, this work describes analysis methods developed for the Bicep1 three-year data analysis, which includes the full data set acquired by Bicep1. This analysis has produced the tightest constraint on the B-mode polarization of the CMB to date, corresponding to a tensor-to-scalar ratio estimate of r = 0.04 ± 0.32, or a Bayesian 95% credible interval of r < 0.70. These analysis methods, in addition to producing this new constraint, are directly applicable to future analyses of Bicep2 data. Taken together, the experimental techniques and analysis methods described herein promise to open a new observational window into the inflationary epoch and the initial conditions of our universe.
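How a point estimate of r = 0.04 ± 0.32 turns into a Bayesian 95% upper limit can be illustrated with a toy calculation: approximate the likelihood as Gaussian, impose a flat prior restricted to the physical region r ≥ 0, and solve for the bound containing 95% of the truncated posterior. This lands near, but not exactly at, the published r < 0.70, which comes from the full likelihood analysis rather than this Gaussian approximation:

```python
# Toy truncated-Gaussian 95% credible upper limit on r (illustrative).
import math

def norm_cdf(x, mu, sigma):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def upper_limit(r_hat, sigma, cl=0.95):
    tail = norm_cdf(0.0, r_hat, sigma)       # likelihood mass cut off at r < 0
    target = tail + cl * (1.0 - tail)        # CDF value of the credible bound
    lo, hi = 0.0, 10.0
    for _ in range(60):                      # bisection to invert the CDF
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid, r_hat, sigma) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"toy 95% upper limit: r < {upper_limit(0.04, 0.32):.2f}")
```

Because the point estimate sits well within one sigma of zero, nearly half the unconstrained Gaussian lies in the unphysical region, and the positivity prior is what makes an upper limit the natural summary of the measurement.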

Relevance: 10.00%

Abstract:

Optical microscopy has become an indispensable tool for biological research since its invention, owing mostly to its sub-cellular spatial resolution, non-invasiveness, instrumental simplicity, and the intuitive observations it provides. Nonetheless, obtaining reliable, quantitative spatial information from conventional wide-field optical microscopy is not as straightforward as it appears, because in the acquired images the information about out-of-focus regions is spatially blurred and mixed with the in-focus information. In other words, conventional wide-field optical microscopy transforms the three-dimensional, volumetric information about an object into a two-dimensional form in each acquired image, and thereby distorts the spatial information about the object. Several fluorescence holography-based methods have demonstrated the ability to obtain three-dimensional information about objects, but these methods generally rely on decomposing stereoscopic visualizations to extract volumetric information and are unable to resolve complex three-dimensional structures such as a multi-layer sphere.

The concept of optical-sectioning techniques, on the other hand, is to detect only two-dimensional information about an object at each acquisition. Specifically, each image obtained by optical-sectioning techniques contains mainly the information about an optically thin layer inside the object, as if only a thin histological section is being observed at a time. Using such a methodology, obtaining undistorted volumetric information about the object simply requires taking images of the object at sequential depths.
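The contrast between the two acquisition modes can be shown with a toy numeric example: a wide-field image mixes every depth of a 3-D object into one 2-D frame, while an optical-sectioning stack records one thin layer per acquisition and so preserves the volumetric information. For simplicity the sketch omits blur; in reality out-of-focus planes contribute a defocused rather than sharp copy:

```python
# Toy wide-field vs. optical-sectioning comparison on a synthetic object.
import numpy as np

# Synthetic 3-layer object: one distinct feature per z-plane
obj = np.zeros((3, 4, 4))
obj[0, 0, 0] = 1.0   # feature on layer 0
obj[1, 1, 1] = 1.0   # feature on layer 1
obj[2, 2, 2] = 1.0   # feature on layer 2

widefield = obj.sum(axis=0)            # all depths collapse into one image
sections = [obj[z] for z in range(3)]  # one clean image per depth

# Wide-field shows all three features with their depths lost;
# each section shows only its own layer's feature.
print("wide-field nonzero pixels:", int((widefield > 0).sum()))
print("section 1 nonzero pixels:", int((sections[1] > 0).sum()))
```

Recovering the volume from the sectioned stack is trivial (stack the slices); recovering it from the wide-field image alone is impossible here, which is the distortion of spatial information described above.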

Among existing methods of obtaining volumetric information, its practicability has made optical sectioning the most commonly used and most powerful one in biological science. However, when applied to imaging living biological systems, conventional single-point-scanning optical-sectioning techniques often cause a degree of photo-damage because of the high focal intensity at the scanning point. To overcome this issue, several wide-field optical-sectioning techniques have been proposed and demonstrated, although not without introducing new limitations and compromises, such as low signal-to-background ratios and reduced axial resolution. As a result, single-point-scanning optical-sectioning techniques remain the most widely used instrumentation for volumetric imaging of living biological systems to date.

In order to develop wide-field optical-sectioning techniques with optical performance equivalent to that of single-point-scanning ones, this thesis first introduces the mechanisms and limitations of existing wide-field optical-sectioning techniques, and then presents our innovations aimed at overcoming these limitations. We demonstrate, theoretically and experimentally, that our proposed wide-field optical-sectioning techniques can achieve diffraction-limited optical sectioning, low out-of-focus excitation, and high-frame-rate imaging in living biological systems. In addition to these imaging capabilities, our proposed techniques are instrumentally simple and economical, and are straightforward to implement on conventional wide-field microscopes. Together, these advantages show the potential of our innovations to be widely used for high-speed, volumetric fluorescence imaging of living biological systems.