954 results for to-sample
Abstract:
Past research has demonstrated emergent conditional relations using a go/no-go procedure with pairs of figures displayed side by side on a computer screen. The present study sought to extend applications of this procedure. In Experiment 1, we evaluated whether emergent conditional relations could be demonstrated when two-component stimuli were displayed in figure-ground relationships: abstract figures displayed on backgrounds of different colors. Five normally capable adults participated. During training, each two-component stimulus was presented successively. Responses emitted in the presence of some stimulus pairs (A1B1, A2B2, A3B3, B1C1, B2C2 and B3C3) were reinforced, whereas responses emitted in the presence of other pairs (A1B2, A1B3, A2B1, A2B3, A3B1, A3B2, B1C2, B1C3, B2C1, B2C3, B3C1 and B3C2) were not. During tests, new configurations (AC and CA) were presented, thus structurally emulating the matching-to-sample tests employed in typical equivalence studies. All participants showed emergent relations consistent with stimulus equivalence during testing. In Experiment 2, we systematically replicated the procedures with stimulus compounds consisting of four figures (A1, A2, C1 and C2) and two locations (left - B1 and right - B2). All 6 normally capable adults exhibited emergent stimulus-stimulus relations. Together, these experiments show that the go/no-go procedure is a potentially useful alternative for studying emergent conditional relations when matching-to-sample is procedurally cumbersome or impossible to use.
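The training and test contingencies described above can be sketched as a simple lookup. The stimulus labels (A1-C3) follow the abstract; the functions themselves are purely illustrative, not the authors' software.

```python
# Illustrative sketch of the go/no-go contingencies from Experiment 1:
# responding to "consistent" AB/BC compounds is reinforced; responding
# to all other AB/BC compounds is not.

REINFORCED = {
    ("A1", "B1"), ("A2", "B2"), ("A3", "B3"),
    ("B1", "C1"), ("B2", "C2"), ("B3", "C3"),
}

def is_reinforced(first: str, second: str) -> bool:
    """True if responding to this trained two-component stimulus is reinforced."""
    return (first, second) in REINFORCED

def expected_equivalence(first: str, second: str) -> bool:
    """Emergent AC/CA test relations predicted by stimulus equivalence:
    compounds sharing the same numeric index (e.g. A1C1, C2A2) should
    control 'go' responding even though they were never trained."""
    return first[1] == second[1]
```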
Abstract:
The present experiment investigated whether pigeons can show associative symmetry in a two-alternative matching-to-sample procedure. The procedure consisted of a within-subject sequence of training and testing with reinforcement, and it provided (a) exemplars of symmetrical responding and (b) all prerequisite discriminations among test samples and comparisons. After pigeons had learned two arbitrary matching tasks (A-B and C-D), they were given a reinforced symmetry test for half of the baseline relations (B1-A1 and D1-C1). To control for the effects of reinforcement during testing, two novel nonsymmetrical responses were concurrently reinforced using the other baseline stimuli (D2-A2 and B2-C2). Pigeons matched at chance on both types of relations, thus indicating no evidence for symmetry. These symmetrical and nonsymmetrical relations were then directly trained in order to provide exemplars of symmetry and all prerequisite discriminations for a second test. The symmetrical test relations were now B2-A2 and D2-C2, and the nonsymmetrical relations were D1-A1 and B1-C1. On this test, 1 pigeon showed clear evidence of symmetry, 2 pigeons showed weak evidence, and 1 pigeon showed no evidence. The previous training of all prerequisite discriminations among stimuli and the within-subject control for testing with reinforcement seem to have set favorable conditions for the emergence of symmetry in nonhumans. However, the variability across subjects shows that methodological variables still remain to be controlled.
Abstract:
Restricted stimulus control refers to discrimination learning with atypical limitations in the range of controlling stimuli or stimulus features. In the study reported here, 4 normally capable individuals and 10 individuals with intellectual disabilities (ID) performed two-sample delayed matching to sample. Sample-stimulus observing was recorded with an eye-tracking apparatus. High accuracy scores indicated stimulus control by both sample stimuli for the 4 nondisabled participants and 4 participants with ID, and eye-tracking data showed reliable observing of all stimuli. Intermediate accuracy scores indicated restricted stimulus control for the remaining 6 participants. Their eye-tracking data showed that errors were related to failures to observe sample stimuli and to relatively brief observing durations. Five of these participants were then given interventions designed to improve observing behavior. For 4 participants, the interventions initially resulted in elimination of observing failures, increased observing durations, and increased accuracy. For 2 of these participants, contingencies sufficient to maintain adequate observing were not always sufficient to maintain high accuracy; subsequent procedure modifications restored it, however. For the 5th participant, initial improvements in observing were not accompanied by improved accuracy, in an apparent instance of observing without attending; accuracy improved only after an additional intervention that imposed contingencies on observing behavior. Thus, interventions that control observing behavior seem necessary but may not always be sufficient for the remediation of restricted stimulus control.
Abstract:
The present study reports on a survey of the gelatinous zooplankton fauna (Cnidaria, Ctenophora and Thaliacea) from the proposed Baia da Babitonga marine protected area (southern Brazil; ~26° S), based on collections from multiple sites over different seasons and from published literature. In order to sample both small and large gelatinous animals, plankton hauls (n = 255) and fishing trawls (n = 126) were employed. More than 20,000 organisms were studied, which, including literature data, totaled 48 species: one cubomedusa, three scyphomedusae, four siphonophores, 36 hydromedusae, two ctenophores, and two thaliaceans. Among these, the hydromedusae Cnidostoma fallax Vanhoffen and Helgicirrha sp. are recorded for the first time from the southwestern Atlantic coast, and Paulinum sp. and Protiara sp. are recorded for the first time from the South Atlantic. A description of young stages of the hydromedusa Gossea brachymera Bigelow is presented and shows that Octobulbacea montehermosensis Zamponi is a junior synonym of the former. Although a comprehensive local assessment of diverse taxonomic groups is still lacking, the high diversity observed herein underscores the importance of Baia da Babitonga as a high-priority site for conservation of regional marine biodiversity.
Nitric Oxide in the Exhaled Breath Condensate of Healthy Volunteers Collected With a Reusable Device
Abstract:
Background: The analysis of exhaled breath condensate (EBC) is a non-invasive technique that enables the determination of several volatile and nonvolatile substances produced in the respiratory tract, whose measurement may be useful for the diagnosis and monitoring of several respiratory diseases. Objective: The aim of this study was to produce a low-cost reusable device in order to sample exhaled breath condensate in healthy adult volunteers, and to determine the concentration of nitric oxide in the sample collected. Material and methods: The apparatus was made with a U-shaped tube of borosilicate glass. The tube was placed in a container with ice, and unidirectional respiratory valves were fitted to the distal end. Afterwards, nitric oxide was measured in the EBC by chemiluminescence. Results: The total cost of the device was $120.20. EBC samples were obtained from 116 volunteers of both sexes, aged between 20 and 70. The mean volume of EBC collected during 10 minutes was 1.0 ± 0.6 mL, and the mean level of nitric oxide was 12.99 ± 14.38 µM (median 8.72 µM). There was no correlation between the nitric oxide levels in the EBC and age or gender. Conclusion: We demonstrate that it is possible to fabricate a low-cost, efficient, reusable device to collect EBC and determine nitric oxide levels in it. We identified no correlation between the nitric oxide levels in the EBC obtained with this method and either age or sex. (C) 2011 SEPAR. Published by Elsevier España, S.L. All rights reserved.
Abstract:
OBJECTIVE: To analyze the association between noise levels present in preschool institutions and vocal disorders among educators. METHODS: Cross-sectional study conducted in 2009 with 28 teachers from three preschool institutions located in the city of Sao Paulo (Southeastern Brazil). Sound pressure levels were measured according to Brazilian Technical Standards Association, with the use of a sound level meter. The averages were classified according to the levels of comfort, discomfort, and auditory damage proposed by the Pan American Health Organization. The educators underwent voice evaluation: self-assessment with visual analogue scale, auditory perceptual evaluation using the GRBAS scale, and acoustic analysis utilizing the Praat program. To analyze the association between noise and voice evaluation, descriptive statistics and the chi-square test were employed, with significance of 10% due to sample size. RESULTS: The teachers' age ranged between 21 and 56 years. The noise average was 72.7 dB, considered as damage 2. The professionals' vocal self-assessment ranked an average of 5.1 on the scale, being considered as moderate alteration. In the auditory-perceptual assessment, 74% presented vocal alteration, especially hoarseness; of these, 52% were considered mild alterations. In the acoustic assessment the majority presented fundamental frequency below the expected level. Averages for jitter, shimmer and harmonic-noise ratio showed alterations. An association between the presence of noise between the harmonics and vocal disorders was observed. CONCLUSIONS: There is an association between presence of noise between the harmonics and vocal alteration, with high noise levels. Although most teachers presented mild voice alteration, the self-evaluation showed moderate alteration, probably due to the difficulty in projection.
Abstract:
The forest-like characteristics of agroforestry systems create a unique opportunity to combine agricultural production with biodiversity conservation in human-modified tropical landscapes. The cacao-growing region in southern Bahia, Brazil, encompasses Atlantic forest remnants and large extensions of agroforests, locally known as cabrucas, and harbors several endemic large mammals. Based on the differences between cabrucas and forests, we hypothesized that: (1) non-native and non-arboreal mammals are more frequent, whereas exclusively arboreal and hunted mammals are less frequent in cabrucas than forests; (2) the two systems differ in mammal assemblage structure, but not in species richness; and (3) mammal assemblage structure is more variable among cabrucas than forests. We used camera-traps to sample mammals in nine pairs of cabruca-forest sites. The high conservation value of agroforests was supported by the presence of species of conservation concern in cabrucas, and similar species richness and composition between forests and cabrucas. Arboreal species were less frequently recorded, however, and a non-native and a terrestrial species adapted to open environments (Cerdocyon thous) were more frequently recorded in cabrucas. Factors that may overestimate the conservation value of cabrucas are: the high proportion of total forest cover in the study landscape, the impoverishment of large mammal fauna in forest, and uncertainty about the long-term maintenance of agroforestry systems. Our results highlight the importance of agroforests and forest remnants for providing connectivity in human-modified tropical forest landscapes, and the importance of controlling hunting and dogs to increase the value of agroforestry mosaics.
Abstract:
Theoretical and empirical studies demonstrate that the total amount of forest and the size and connectivity of fragments have nonlinear effects on species survival. We tested how habitat amount and configuration affect understory bird species richness and abundance. We used mist nets (almost 34,000 net hours) to sample birds in 53 Atlantic Forest fragments in southeastern Brazil. Fragments were distributed among three 10,800-ha landscapes. The remaining forest in these landscapes was below (10% forest cover), similar to (30%), and above (50%) the theoretical fragmentation threshold (approximately 30%) below which the effects of fragmentation should be intensified. Species-richness estimates were significantly higher (F = 3715, p = 0.00) where 50% of the forest remained, which suggests a species occurrence threshold of 30-50% forest, which is higher than usually occurs (<30%). Relations between forest cover and species richness differed depending on species sensitivity to forest conversion and fragmentation. For less sensitive species, species richness decreased as forest cover increased, whereas for highly sensitive species the opposite occurred. For sensitive species, species richness and the amount of forest cover were positively related, particularly when forest cover was 30-50%. Fragment size and connectivity were related to species richness and abundance in all landscapes, not just below the 30% threshold. Where 10% of the forest remained, fragment size was more related to species richness and abundance than connectivity. However, the relation between connectivity and species richness and abundance was stronger where 30% of the landscape was forested. Where 50% of the landscape was forested, fragment size and connectivity were both related to species richness and abundance. Our results demonstrated a rapid loss of species at relatively high levels of forest cover (30-50%).
Highly sensitive species were 3-4 times more common above the 30-50% threshold than below it; however, our results do not support a unique fragmentation threshold.
Abstract:
Several activities were conducted during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned a feasibility study of an acquisition technique quite different from that adopted in the NEMO Phase 1 telescope. The DAQ board that we realized exploits the LIRA06 front-end chip for the analog acquisition of the anodic and dynodic sources of a PMT (photomultiplier tube). The low-power analog acquisition makes it possible to sample multiple channels of the PMT simultaneously at different gain factors in order to increase the linearity of the signal response over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the knowledge of the neutrino event. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has been integrated on the board as well, and specific firmware has been written to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After validation of the whole front-end architecture, this feature would probably be integrated into a common mixed-signal ASIC (application-specific integrated circuit). The volatile nature of the FPGA's configuration memory required the integration of a flash ISP (in-system programming) memory and a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory, the behavior of the LIRA chip was investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA. 
The PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and subsequently digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to a loan from Roma University and INFN, a full readout chain equivalent to that present in NEMO Phase 1 was installed. These tests showed good behavior of the digital electronics, which were able to receive and execute commands issued from the PC console and to answer back with a reply. The remotely configurable logic also behaved well and demonstrated, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase 2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front end but inheriting most of the digital logic present in the current DAQ board discussed in this thesis. As concerns the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (monolithic active pixel sensor) device called APSEL-4D. This chip is a matrix of 4096 active pixel sensors with deep N-well implantations meant for charge collection and for shielding the analog electronics from digital noise. The chip integrates the full-custom sensor matrix and the sparsification/readout logic, realized with standard cells in 130 nm STM CMOS technology. For the chip characterization, a test beam was set up on the 12 GeV PS (Proton Synchrotron) line facility at CERN in Geneva (CH). 
The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which allowed about 90 million events to be stored in 7 equivalent days of beam live-time. My activities basically concerned the realization of a firmware interface towards and from the MAPS chip in order to integrate it into the general DAQ system. Thereafter, I worked on the DAQ software to implement a proper slow-control interface for the APSEL-4D. Several APSEL-4D chips with different thinnings were tested during the test beam. Those thinned to 100 and 300 µm presented an overall efficiency of about 90% at a threshold of 450 electrons. The test beam also allowed the resolution of the pixel sensor to be estimated, providing good results consistent with the pitch/sqrt(12) formula. The MAPS intrinsic resolution was extracted from the width of the residual plot, taking the multiple-scattering effect into account.
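The pitch/sqrt(12) figure mentioned above is the RMS of a uniform distribution over one pixel pitch, i.e. the intrinsic resolution of a binary pixel readout. A one-line sketch (the 50 µm pitch in the comment is a hypothetical value, not taken from the abstract):

```python
import math

def binary_resolution(pitch_um: float) -> float:
    """Intrinsic resolution of a binary pixel readout: the RMS of a
    uniform distribution over one pixel pitch, i.e. pitch / sqrt(12)."""
    return pitch_um / math.sqrt(12)

# e.g. for a hypothetical 50 um pitch, sigma is about 14.4 um
```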
Abstract:
A system for the digital-holographic imaging of airborne objects, suitable for ground-based field measurements, was developed and constructed. Depending on the depth position, it is suited to directly determining the size of airborne objects above roughly 20 µm, as well as their shape for sizes from roughly 100 µm up to the millimeter range. The development additionally included an algorithm for the automated improvement of hologram quality and for the semi-automatic distance determination of large objects. A way to intrinsically increase the efficiency of determining the depth position by computing angle-averaged profiles was presented. Furthermore, a method was developed that, using an iterative approach for isolated objects, allows recovery of the phase information and thus removal of the twin image. In addition, the effects of various limitations of digital holography, such as the finite pixel size, were investigated and discussed by means of simulations. The appropriate display of the three-dimensional position information poses a particular problem in digital holography, since the three-dimensional light field is not physically reconstructed. A method was developed and implemented that, by constructing a stereoscopic representation of the numerically reconstructed measurement volume, permits quasi-three-dimensional, magnified viewing. Selected digital holograms recorded during field campaigns at the Jungfraujoch were reconstructed. These partly showed a very high fraction of irregular crystal shapes, in particular as a consequence of heavy riming. Objects down to the range of ≤20 µm were also observed during periods with formally ice-subsaturated conditions. Furthermore, by applying the theory of the "phase edge effect" developed here, an object of only about 40 µm in size could be identified as an ice platelet. The greatest disadvantage of digital holography compared with conventional photographic imaging techniques is the need for elaborate numerical reconstruction; a high computational effort is required to achieve a result comparable to a photograph. On the other hand, digital holography has unique capabilities. Access to the three-dimensional position information can serve the local investigation of relative object distances. However, it turned out that the circumstances of digital holography currently make it difficult to observe sufficiently large numbers of objects on the basis of individual holograms. It was demonstrated that complete object boundaries could be reconstructed even when an object lay partly or entirely outside the geometric measurement volume. Furthermore, the sub-pixel reconstruction, first demonstrated in simulations, was applied to real holograms. It could be shown that quasi-point-like objects could in part be localized with sub-pixel accuracy, and that additional information could be obtained for extended objects as well. Finally, interference patterns were observed on reconstructed ice crystals and in part tracked over time. At present, both crystal-internal reflection and the existence of a (quasi-)liquid layer appear possible as explanations, with some arguments pointing toward the latter. As a result of this work, a system is now available that comprises a new measurement instrument and extensive algorithms. S. M. F. Raupach, H.-J. Vössing, J. Curtius and S. Borrmann: Digital crossed-beam holography for in-situ imaging of atmospheric particles, J. Opt. A: Pure Appl. Opt. 8, 796-806 (2006). S. M. F. Raupach: A cascaded adaptive mask algorithm for twin image removal and its application to digital holograms of ice crystals, Appl. Opt. 48, 287-301 (2009). S. M. F. Raupach: Stereoscopic 3D visualization of particle fields reconstructed from digital inline holograms (accepted for publication, Optik - Int. J. Light El. Optics, 2009).
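A minimal sketch of the kind of numerical hologram reconstruction the abstract refers to, here via the angular spectrum method. The wavelength, pixel pitch, and depth used below are placeholder parameters, and this is not the author's actual reconstruction code (which additionally handles twin-image removal and masking):

```python
import numpy as np

def angular_spectrum_propagate(hologram, wavelength, pitch, z):
    """Numerically refocus a recorded in-line hologram to depth z with the
    angular spectrum method (illustrative sketch only; real reconstructions
    add filtering, twin-image suppression, and aperture handling)."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=pitch)          # spatial frequencies [1/m]
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Transfer function of free-space propagation over distance z;
    # evanescent components (arg <= 0) are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    phase = 2j * np.pi * z * np.sqrt(np.clip(arg, 0.0, None)) / wavelength
    H = np.where(arg > 0, np.exp(phase), 0.0)
    return np.fft.ifft2(np.fft.fft2(hologram) * H)
```

A uniform field (plane wave) only acquires a global phase under propagation, which gives a quick sanity check of the transfer function.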
Abstract:
The aim of my thesis is to parallelize the Weighted Histogram Analysis Method (WHAM), a popular algorithm used to calculate the free energy of a molecular system in molecular dynamics simulations. WHAM works in post-processing in cooperation with another algorithm called umbrella sampling. Umbrella sampling adds a bias to the potential energy of the system in order to force the system to sample a specific region of configurational space. Several (N) independent simulations are performed in order to sample the whole region of interest. Subsequently, the WHAM algorithm is used to estimate the original system energy starting from the N atomic trajectories. The parallelization of WHAM has been performed with CUDA, a language that makes it possible to run code on the GPUs of NVIDIA graphics cards, which have a parallel architecture. The parallel implementation can sensibly speed up WHAM execution compared with previous serial CPU implementations. However, the WHAM CPU code presents some timing criticalities at very high numbers of iterations. The algorithm was written in C++ and executed on UNIX systems equipped with NVIDIA graphics cards. The results were satisfying, showing a performance increase when the model was executed on graphics cards with higher compute capability. Nonetheless, the GPUs used to test the algorithm were quite old and not designed for scientific computing. It is likely that a further performance increase would be obtained if the algorithm were executed on GPU clusters with high computational efficiency. The thesis is organized as follows: I first describe the mathematical formulation of umbrella sampling and the WHAM algorithm, with their applications to the study of ionic channels and to molecular docking (Chapter 1); then I present the CUDA architecture used to implement the model (Chapter 2); finally, the results obtained on model systems are presented (Chapter 3).
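A minimal serial sketch of the WHAM self-consistency iteration described above (a reference version in Python, not the thesis's C++/CUDA implementation; the array shapes and names are assumptions):

```python
import numpy as np

def wham(hist, bias, beta, n_iter=1000, tol=1e-10):
    """Self-consistent WHAM iteration over binned umbrella-sampling data.

    hist: (n_windows, n_bins) histogram counts from each umbrella window
    bias: (n_windows, n_bins) umbrella bias energy evaluated at bin centers
    beta: 1 / (kB * T)
    Returns the estimated unbiased probability over the bins.
    """
    c = np.exp(-beta * bias)          # Boltzmann factors of the biases
    N = hist.sum(axis=1)              # samples per window
    num = hist.sum(axis=0)            # pooled counts per bin
    f = np.ones(len(N))               # per-window normalization constants
    for _ in range(n_iter):
        # Unbiased density estimate given current normalizations.
        p = num / (N[:, None] * f[:, None] * c).sum(axis=0)
        # Update normalizations to be consistent with the density.
        f_new = 1.0 / (c * p[None, :]).sum(axis=1)
        converged = np.max(np.abs(f_new - f)) < tol
        f = f_new
        if converged:
            break
    return p / p.sum()
```

With zero bias the iteration should simply recover the pooled, normalized histogram, which makes a convenient correctness check.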
Abstract:
This work presents a detailed study of fundamental properties of the calcite CaCO3(10.4) surface and related mineral surfaces, made possible not only by the use of non-contact atomic force microscopy but chiefly by the measurement of force fields. The absolute surface orientation, as well as the underlying atomic-scale process, was successfully identified for the calcite (10.4) surface. The adsorption of chiral molecules on calcite is relevant to biomineralization, which makes an understanding of the surface symmetry indispensable. Measuring the surface force field at the atomic level is a central aspect of this. Such a force map not only illuminates the interaction of the surface with molecules, which is important for biomineralization, but also offers the possibility of identifying atomic-scale processes and thus surface properties. The introduction of a highly flexible measurement protocol ensures the reliable measurement of the surface force field, which is not available commercially. The conversion of the raw ∆f data into the vertical force Fz is, however, not a trivial procedure, especially when smoothing of the data is considered. This work describes in detail how Fz can be computed correctly for the experimental conditions of this work. It further describes how the lateral forces Fy and the dissipation Γ were obtained, in order to exploit the full potential of this measurement method. To understand atomic-scale processes on surfaces, the short-range chemical forces Fz,SR are of utmost importance. Long-range contributions must be fitted to Fz and subtracted from it. This is, however, an error-prone task, which was mastered in this work by finding three independent criteria that determine the onset zcut of Fz,SR, which is of central importance for this task. A detailed error analysis shows that using the deviation of the lateral forces from one another as the criterion yields trustworthy Fz,SR. This is the first study to provide a criterion for the determination of zcut, completed with a detailed error analysis. With the knowledge of Fz,SR and Fy it was possible to identify one of the fundamental properties of the CaCO3(10.4) surface: the absolute surface orientation. A strong tilt of the imaged objects
Abstract:
Capuchin monkeys are notable among New World monkeys for their widespread use of tools. They use both hammer tools and insertion tools in the wild to acquire food that would be unobtainable otherwise. Evidence indicates that capuchins transport stones to anvil sites and use the most functionally efficient stones to crack nuts. We investigated capuchins’ assessment of functionality by testing their ability to select a tool that was appropriate for two different tool-use tasks: A stone for a hammer task and a stick for an insertion task. To select the appropriate tools, the monkeys investigated a baited tool-use apparatus (insertion or hammer), traveled to a location in their enclosure where they could no longer see the apparatus, made a selection between two tools (stick or stone), and then could transport the tool back to the apparatus to obtain a walnut. Four capuchins were first trained to select and use the appropriate tool for each apparatus. After training, they were then tested by allowing them to view a baited apparatus and then travel to a location 8 m distant where they could select a tool while out of view of the apparatus. All four monkeys chose the correct tool significantly more than expected and transported the tools back to the apparatus. Results confirm capuchins’ propensity for transporting tools, demonstrate their capacity to select the functionally appropriate tool for two different tool-use tasks, and indicate that they can retain the memory of the correct choice during a travel time of several seconds.
Abstract:
Numerous studies have shown that animals have a sense of quantity and can distinguish between relative amounts. The concepts of relative numerousness, estimation, and subitizing are well established in species as diverse as chimpanzees and salamanders. Mobile animals have practical use for an understanding of number in common situations such as predation, mating, and competition. However, the ability to identify discrete quantities has only been firmly established in humans. The purpose of this study was to test for such “absolute numerousness” judgments in three lion-tailed macaques (Macaca silenus), a non-human primate. The three macaques tested had previously been trained on a computerized match-to-sample (MTS) task using geometric shapes. In this study, they were introduced to a MTS task containing a numerical cue, which required the monkeys to match stimuli containing either one or two items for rewards. If monkeys were successful at the initial matching task, they were tested with stimuli in which the position of the items and then the surface area of the items was controlled. If the monkeys could match successfully without using these non-numerical cues, they would demonstrate the capability to make absolute numerousness judgments. None of the monkeys matched successfully using the numerical cue, so no evidence of absolute numerosity was found. Each macaque progressed through the experiment in an individualized manner, attempting a variety of strategies to obtain rewards. These included side preferences and an alternating-side strategy that were unrelated to the numerical cues in the stimuli. When it became clear that the monkeys were not matching based on a stimulus-based cue, they were tested again on matching geometric shapes. All three macaques stopped using their alternate strategies and were able to match shapes successfully, demonstrating that they were still capable of completing the matching task.
The data suggest that the monkeys could not transfer this ability to the numerical stimuli. This indicates that the macaques lack a sense of exact quantity, or that they could not recognize the numerical cues in the stimuli as being relevant to the task.
Abstract:
Micelle-forming bile salts have previously been shown to be effective pseudo-stationary phases for separating the chiral isomers of binaphthyl compounds with micellar electrokinetic capillary chromatography (MEKC). Here, cholate micelles are systematically investigated via electrophoretic separations and NMR using R,S-1,1′-binaphthyl-2,2′-diyl hydrogen phosphate (BNDHP) as a model chiral analyte. The pH, temperature, and concentration of BNDHP were systematically varied while monitoring the chiral resolution obtained with MEKC and the chemical shift of various protons in NMR. The NMR data for each proton on BNDHP are monitored as a function of cholate concentration: as cholate monomers begin to aggregate and the analyte molecules begin to sample the micelle aggregate, we observe changes in the cholate methyl and S-BNDHP proton chemical shifts. From such NMR data, the apparent CMC of cholate at pH 12 is found to be about 13-14 mM, but this value decreases at higher pH, suggesting that more extreme pHs may give rise to more effective separations. In general, CMCs increase with temperature, indicating that one may be able to obtain better separations at lower temperatures. S-BNDHP concentrations ranging from 50 µM to 400 µM (pH 12.8) gave rise to apparent cholate CMC values from 10 mM to 8 mM, respectively, indicating that S-BNDHP, the chiral analyte molecule, may play an active role in stabilizing cholate aggregates. In all, these data show that NMR can be used to systematically investigate a complex multi-variable landscape of potential optimizations of chiral separations.
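One common way to extract an apparent CMC from shift-versus-concentration data such as these is to locate the breakpoint between two linear regimes (monomer regime vs. micellar regime). The abstract does not specify the fitting method used, so the two-segment scan below is an illustrative sketch only:

```python
import numpy as np

def cmc_from_breakpoint(conc, shift):
    """Estimate an apparent CMC as the concentration where two fitted line
    segments intersect. Naive exhaustive scan over split points; purely
    illustrative, not the analysis used in the study."""
    best_resid, best_cmc = np.inf, None
    for k in range(2, len(conc) - 2):
        # Fit a line to each side of the candidate split.
        a1, b1 = np.polyfit(conc[:k], shift[:k], 1)
        a2, b2 = np.polyfit(conc[k:], shift[k:], 1)
        resid = (np.sum((np.polyval([a1, b1], conc[:k]) - shift[:k]) ** 2)
                 + np.sum((np.polyval([a2, b2], conc[k:]) - shift[k:]) ** 2))
        if resid < best_resid and a1 != a2:
            best_resid = resid
            best_cmc = (b2 - b1) / (a1 - a2)   # intersection of the two lines
    return best_cmc
```

On synthetic data with a flat regime below a break at 13 mM and a linear rise above it, the scan recovers the break concentration.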