985 results for Detection Probability
Abstract:
Aerial surveys of narwhals (Monodon monoceros) were conducted in the Canadian High Arctic during the month of August from 2002 to 2004. The surveys covered the waters of Barrow Strait, Prince Regent Inlet, the Gulf of Boothia, Admiralty Inlet, Eclipse Sound, and the eastern coast of Baffin Island, using systematic sampling methods. Fiords were flown along a single transect down the middle. Near-surface population estimates increased by 1.9%-8.7% when corrected for perception bias. The estimates were further increased by a factor of approximately 3, to account for individuals not seen because they were diving when the survey plane flew over (availability bias). These corrections resulted in estimates of 27 656 (SE = 14 939) for the Prince Regent and Gulf of Boothia area, 20 225 (SE = 7285) for the Eclipse Sound area, and 10 073 (SE = 3123) for the East Baffin Island fiord area. The estimate for the Admiralty Inlet area was 5362 (SE = 2681) but is thought to be biased. Surveys could not be done in other known areas of occupation, such as the waters of the Cumberland Peninsula of East Baffin, and channels farther west of the areas surveyed (Peel Sound, Viscount Melville Sound, Smith Sound and Jones Sound, and other channels of the Canadian Arctic archipelago). Despite these probable biases and the incomplete coverage, results of these surveys show that the summering range of narwhals in the Canadian High Arctic is vast. If narwhals are philopatric to their summering areas, as they appear to be, the total population of that range could number more than 60 000 animals. The largest numbers are in the western portion of their summer range, around Somerset Island, and also in the Eclipse Sound area. However, these survey estimates have large variances due to narwhal aggregation in some parts of the surveyed areas.
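As a rough illustration of the two-stage correction described in this abstract, the Python sketch below inflates a hypothetical near-surface count by a perception-bias factor and by the roughly threefold availability-bias factor mentioned in the text; the specific input numbers are assumptions, not the survey's published values.

```python
# Illustrative sketch of the two-stage correction described above; the
# perception-bias factor and the near-surface count are assumptions, and
# the ~3x availability factor is the approximate value quoted in the abstract.

def corrected_abundance(near_surface_estimate, perception_correction, availability_correction):
    """Inflate a near-surface count for animals missed by observers
    (perception bias) and animals submerged during the overflight
    (availability bias)."""
    return near_surface_estimate * perception_correction * availability_correction

# Example: a hypothetical 5% perception correction and the ~3x availability factor.
print(corrected_abundance(6500, 1.05, 3.0))  # -> 20475.0
```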
Abstract:
Funding — Forest Enterprise Scotland and the University of Aberdeen provided funding for the project. The Carnegie Trust supported the lead author, E. McHenry, in this research through the award of a tuition fees bursary.
Abstract:
Estimates of abundance or density are essential for wildlife management and conservation. There are few effective density estimates for the Buff-throated Partridge Tetraophasis szechenyii, a rare and elusive high-mountain Galliform species endemic to western China. In this study, we used a temporary-emigration N-mixture model to estimate the density of this species, with data acquired from playback point-count surveys around a sacred area protected under indigenous Tibetan culture, in Yajiang County, Sichuan, China, during April–June 2009. Across 84 points of 125-m radius, we recorded 53 partridge groups over three repeat visits. The best model indicated that detection probability was weakly affected by vegetation cover type, week of visit, time of day, and weather, and that a partridge group was present during a sampling period with constant probability. The abundance component was explained by vegetation association: abundance was substantially higher in rhododendron shrubs, fir-larch forests, mixed spruce-larch-birch forests, and especially oak thickets than in pine forests. The model predicted a density of 5.14 groups/km², similar to the 4.7–5.3 groups/km² obtained through an intensive spot-mapping effort. The post-hoc estimate of individual density was 14.44 individuals/km², based on an estimated mean group size of 2.81. We suggest that the method we employed is applicable to estimating densities of Buff-throated Partridges over large areas. Given the importance of mosaic habitat for this species, local logging should be regulated. Although the sacred conservation area had no detectable effect on the abundance of Buff-throated Partridges, we suggest regulations linking the sacred-mountain conservation area with the official conservation system, because sacred mountains foster strong local participation in land conservation.
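The post-hoc conversion from group density to individual density quoted above is simple arithmetic; the short sketch below reproduces it using the figures given in the abstract.

```python
# Post-hoc conversion from group density to individual density, as described
# in the abstract (figures taken directly from the text).
group_density = 5.14        # groups per km^2 (model prediction)
mean_group_size = 2.81      # estimated mean partridge group size
individual_density = group_density * mean_group_size
print(round(individual_density, 2))  # 14.44 individuals per km^2
```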
Abstract:
The importance of non-destructive techniques (NDT) in structural health monitoring programmes has become increasingly evident in recent times. The quality of the measured data, often affected by various environmental conditions, can be a guiding factor for the usefulness and prediction efficiency of the detection and monitoring methods used. Often, preprocessing the acquired data in relation to the relevant environmental parameters can improve information quality and lead to a significantly more efficient and correct prediction process. The improvement can be directly related to the final decision-making policy for a structure or a network of structures and is compatible with general probabilistic frameworks for such assessment and decision-making programmes. This paper considers a preprocessing technique employed within an image-analysis-based structural health monitoring methodology to identify submarine pitting corrosion in the presence of variable luminosity, contrast, and noise affecting image quality. Preprocessing the gray-level threshold of the various images is observed to bring about a significant improvement in damage detection compared with an automatically computed gray-level threshold. Case-dependent adjustment of the threshold makes it possible to extract the best possible information from an existing image. The corresponding improvements are assessed qualitatively in the present study.
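As a rough sketch of the thresholding idea discussed above, the snippet below compares an automatically computed (Otsu) gray-level threshold with a case-dependent manual adjustment; the input image and the adjustment offset are hypothetical, and this is not the authors' actual pipeline.

```python
import numpy as np

# Minimal sketch: segment a grayscale image of a corroded surface by
# thresholding, comparing an automatically computed (Otsu) threshold with a
# case-dependent, manually adjusted one. Image and offset are hypothetical.

def otsu_threshold(image, bins=256):
    """Return the gray level that maximizes between-class variance."""
    hist, edges = np.histogram(image.ravel(), bins=bins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)                      # weight of the "dark" class
    w1 = 1.0 - w0                             # weight of the "bright" class
    mu = np.cumsum(hist * centers)            # cumulative mean
    mu_total = mu[-1]
    valid = (w0 > 0) & (w1 > 0)
    between_var = np.zeros_like(w0)
    between_var[valid] = (mu_total * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(between_var)]

image = np.random.rand(128, 128)              # stand-in for a corrosion image
t_auto = otsu_threshold(image)
t_adjusted = t_auto - 0.05                    # case-dependent tweak for poor lighting (illustrative)
pit_mask = image < t_adjusted                 # candidate pitting-corrosion pixels
print(t_auto, pit_mask.mean())
```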
Abstract:
Marine mammals exploit the efficiency of sound propagation in the marine environment for essential activities like communication and navigation. For this reason, passive acoustics has particularly high potential for marine mammal studies, especially those aimed at population management and conservation. Despite the rapid realization of this potential through a growing number of studies, much crucial information remains unknown or poorly understood. This research attempts to address two key knowledge gaps, using the well-studied bottlenose dolphin (Tursiops truncatus) as a model species, and underwater acoustic recordings collected on four fixed autonomous sensors deployed at multiple locations in Sarasota Bay, Florida, between September 2012 and August 2013. Underwater noise can hinder dolphin communication. The ability of these animals to overcome this obstacle was examined using recorded noise and dolphin whistles. I found that bottlenose dolphins are able to compensate for increased noise in their environment using a wide range of strategies employed singly or in various combinations, depending on the frequency content of the noise, noise source, and time of day. These strategies include modifying whistle frequency characteristics, increasing whistle duration, and increasing whistle redundancy. Recordings were also used to evaluate the performance of six recently developed passive acoustic abundance estimation methods, by comparing their results to the true abundance of animals, obtained via a census conducted within the same area and time period. The methods employed were broadly divided into two categories – those involving direct counts of animals, and those involving counts of cues (signature whistles). The animal-based methods were traditional capture-recapture, spatially explicit capture-recapture (SECR), and an approach that blends the “snapshot” method and mark-recapture distance sampling, referred to here as SMRDS. The cue-based methods were conventional distance sampling (CDS), an acoustic modeling approach involving the use of the passive sonar equation, and SECR. In the latter approach, detection probability was modelled as a function of sound transmission loss, rather than the Euclidean distance typically used. Of these methods, while SMRDS produced the most accurate estimate, SECR demonstrated the greatest potential for broad applicability to other species and locations, with minimal to no auxiliary data, such as distance from sound source to detector(s), which is often difficult to obtain. This was especially true when this method was compared to traditional capture-recapture results, which greatly underestimated abundance, despite attempts to account for major unmodelled heterogeneity. Furthermore, the incorporation of non-Euclidean distance significantly improved model accuracy. The acoustic modelling approach performed similarly to CDS, but both methods also strongly underestimated abundance. In particular, CDS proved to be inefficient: it requires at least three sensors for localization at a single point. It was also difficult to obtain accurate distances, and the sample size was greatly reduced by the failure to detect some whistles on all three recorders. As a result, this approach is not recommended for marine mammal abundance estimation when few recorders are available, or in high sound attenuation environments with relatively low sample sizes.
It is hoped that these results lead to more informed management decisions, and therefore, more effective species conservation.
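A minimal sketch of the non-Euclidean detection function mentioned above is given below: detection probability is modelled as a half-normal function of acoustic transmission loss instead of Euclidean distance. The spreading model, absorption coefficient, and half-normal parameters are illustrative assumptions, not the values fitted in this study.

```python
import numpy as np

# Hedged sketch of the non-Euclidean SECR idea: replace Euclidean distance in
# the detection function with acoustic transmission loss. Parameter values
# are assumptions for illustration only.

def transmission_loss(distance_m, spreading=15.0, absorption_db_per_km=1.0):
    """Simple range-dependent loss: geometric spreading plus absorption (dB)."""
    distance_m = np.maximum(distance_m, 1.0)   # avoid log of zero
    return spreading * np.log10(distance_m) + absorption_db_per_km * distance_m / 1000.0

def detection_prob(loss_db, g0=0.9, sigma_db=12.0):
    """Half-normal detection function evaluated on transmission loss."""
    return g0 * np.exp(-loss_db**2 / (2.0 * sigma_db**2))

# Example: probability that a whistle emitted 500 m from a recorder is detected.
print(detection_prob(transmission_loss(500.0)))
```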
Abstract:
Massive Internet of Things is expected to play a crucial role in Beyond 5G (B5G) wireless communication systems, offering seamless connectivity among heterogeneous devices without human intervention. However, given the exponential proliferation of smart devices, IoT networks relying solely on terrestrial infrastructure may not fully meet the demanding IoT requirements in terms of bandwidth and connectivity, especially in areas where terrestrial infrastructure is not economically viable. To unleash the full potential of 5G and B5G networks and enable seamless connectivity everywhere, the 3GPP envisions the integration of Non-Terrestrial Networks (NTNs) into the terrestrial ones starting from Release 17. However, this integration process requires modifications to the 5G standard to ensure reliable communications despite typical satellite channel impairments. In this framework, this thesis aims to propose techniques at the Physical and Medium Access Control layers that require minimal adaptation of the current NB-IoT standard for operation via NTN. First, the satellite channel impairments are evaluated and a detailed link budget analysis is provided. Then, analyses at the link and system levels are conducted. In the former case, a novel algorithm leveraging time-frequency analysis is proposed to detect orthogonal preambles and estimate the signals’ arrival time. In addition, the effects of collisions on the detection probability and Bit Error Rate are investigated, and Non-Orthogonal Multiple Access approaches are proposed for the random access and data phases. The system-level analysis evaluates the performance of random access under congestion. Various access parameters are tested in different satellite scenarios, and the performance is measured in terms of access probability and the time required to complete the procedure. Finally, a heuristic algorithm is proposed to jointly design the access and data phases, determining the number of satellite passages, the Random Access Periodicity, and the number of uplink repetitions that maximize the system's spectral efficiency.
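As an illustration of the kind of link budget analysis mentioned above, the sketch below computes free-space path loss and a rough SNR for an NB-IoT uplink to a LEO satellite; all parameter values (altitude, carrier frequency, powers, gains, noise figure) are assumptions for illustration, not the thesis' actual numbers.

```python
import math

# Minimal link-budget sketch in the spirit of the analysis mentioned above.
# All parameter values are illustrative assumptions.

def free_space_path_loss_db(distance_m, frequency_hz):
    """FSPL = 20*log10(4*pi*d*f/c), in dB."""
    c = 299_792_458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * frequency_hz / c)

slant_range_m = 600e3          # LEO satellite roughly overhead (assumed)
carrier_hz = 2e9               # S-band carrier (assumed)
eirp_dbm = 23.0                # NB-IoT device transmit EIRP (assumed)
rx_gain_dbi = 30.0             # satellite receive antenna gain (assumed)
noise_floor_dbm = -174 + 10 * math.log10(180e3) + 3  # 180 kHz bandwidth, 3 dB noise figure

received_dbm = eirp_dbm - free_space_path_loss_db(slant_range_m, carrier_hz) + rx_gain_dbi
snr_db = received_dbm - noise_floor_dbm
print(f"FSPL = {free_space_path_loss_db(slant_range_m, carrier_hz):.1f} dB, SNR = {snr_db:.1f} dB")
```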
Abstract:
The application of automatic segmentation methods in lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem, but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, improving on a recently published method. The new method assumes that low similarity should be found in lesion regions when the likeness between an intensity-based fuzzy segmentation and location-based tissue probabilities is measured. The use of a normalized similarity measurement enables the method to fine-tune the threshold for lesion detection, maximizing the possibility of reaching high detection accuracy. Importantly, an extra cleaning step is included that removes enlarged ventricles from the detected lesions. A performance investigation using simulated lesions demonstrated that the majority of lesions were well detected and that normal tissues were identified effectively. Tests on images acquired from stroke patients further confirmed the strength of the method in lesion detection. Compared with the previous version, the current approach showed higher sensitivity in detecting small lesions and produced fewer false positives around the ventricles and the edge of the brain.
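A toy sketch of the similarity-based detection idea is given below: voxels where an intensity-based fuzzy segmentation disagrees with location-based tissue probabilities are flagged as lesion candidates after normalizing the similarity map. The similarity measure, array shapes, and threshold are illustrative assumptions, not the exact method of the article.

```python
import numpy as np

# Toy sketch: flag voxels where an intensity-based fuzzy segmentation
# disagrees with location-based tissue probability maps. Data are random
# stand-ins; the similarity measure and threshold are assumptions.

rng = np.random.default_rng(0)
fuzzy_seg = rng.dirichlet([1, 1, 1], size=(64, 64, 64))      # GM/WM/CSF memberships
tissue_prior = rng.dirichlet([1, 1, 1], size=(64, 64, 64))   # atlas tissue probabilities

# Per-voxel similarity: sum of element-wise minima of the two membership vectors,
# which lies in [0, 1] and equals 1 when the distributions agree exactly.
similarity = np.minimum(fuzzy_seg, tissue_prior).sum(axis=-1)

# Normalize to [0, 1] over the image so the detection threshold can be tuned.
norm_similarity = (similarity - similarity.min()) / (similarity.max() - similarity.min())

lesion_candidates = norm_similarity < 0.2     # low similarity -> possible lesion
print(lesion_candidates.mean())
```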
Abstract:
We analyze theoretically the interplay between optical return-to-zero signal degradation due to timing jitter and additive amplified-spontaneous-emission noise. The impact of these two factors on the performance of a square-law direct detection receiver is also investigated. We derive an analytical expression for the bit-error probability and quantitatively determine the conditions when the contributions of the effects of timing jitter and additive noise to the bit error rate can be treated separately. The analysis of patterning effects is also presented. © 2007 IEEE.
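For reference, the standard Gaussian-approximation relation between the decision-variable statistics and the bit-error probability of a direct-detection receiver is reproduced below; this is a textbook expression, not necessarily the analytical result derived in the paper.

```latex
% Standard Gaussian-approximation reference (not necessarily the expression
% derived in the paper): bit-error probability of a square-law direct
% detection receiver in terms of the Q-factor.
\[
  P_b = \frac{1}{2}\,\operatorname{erfc}\!\left(\frac{Q}{\sqrt{2}}\right),
  \qquad
  Q = \frac{\mu_1 - \mu_0}{\sigma_1 + \sigma_0},
\]
% where \mu_{0,1} and \sigma_{0,1} are the mean and standard deviation of the
% received decision variable for the "0" and "1" bits; timing jitter and ASE
% noise both enter through these moments.
```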
Abstract:
Alternative splicing of gene transcripts greatly expands the functional capacity of the genome, and certain splice isoforms may indicate specific disease states such as cancer. Splice junction microarrays interrogate thousands of splice junctions, but data analysis is difficult and error-prone because of the increased complexity compared with differential gene expression analysis. We present Rank Change Detection (RCD) as a method to identify differential splicing events based on a straightforward probabilistic model comparing the over- or under-representation of two or more competing isoforms. RCD has advantages over commonly used methods because it is robust to false positive errors arising from nonlinear trends in microarray measurements. Further, RCD does not depend on prior knowledge of splice isoforms, yet it takes advantage of the inherent structure of mutually exclusive junctions, and it is conceptually generalizable to other types of splicing arrays or to RNA-Seq. RCD specifically identifies the biologically important cases in which a splice junction becomes more or less prevalent relative to other mutually exclusive junctions. The example data are from glioblastoma tumor cell lines assayed with Agilent microarrays.
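A toy sketch of the rank-change idea is given below: within a group of mutually exclusive junctions, each junction's intensity rank is compared between two conditions and rank changes are flagged. The data and the decision rule are illustrative, not the published RCD model.

```python
import numpy as np

# Toy sketch of the rank-change idea: compare each junction's rank (by signal
# intensity) within a mutually exclusive group between two conditions and
# flag junctions whose rank changes. Values are illustrative.

def ranks(values):
    """Rank junctions from weakest (0) to strongest within a sample."""
    return np.argsort(np.argsort(values))

control = np.array([120.0, 850.0, 90.0])    # junction intensities, condition A
tumor = np.array([760.0, 110.0, 95.0])      # junction intensities, condition B

rank_change = ranks(control) != ranks(tumor)
print(rank_change)   # [ True  True False ] -> the first two junctions changed rank
```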
Abstract:
The production of conditional quantum states and quantum operations based on the result of measurement is now seen as a key tool in quantum information and metrology. We propose a different type of photon number detector. It functions nondeterministically, but when successful, it has high fidelity. The detector, which makes use of an n-photon auxiliary Fock state and high efficiency homodyne detection, allows a tunable trade-off between fidelity and probability. By sacrificing probability of operation, an excellent approximation to a photon-number detector is achieved.
Abstract:
A passive haemagglutination test (PHA) for human neurocysticercosis was standardized and evaluated for the detection of specific antibodies to Cysticercus cellulosae in cerebrospinal fluid (CSF). For the assay, formaldehyde-treated group O Rh-negative human red cells coated with the crude total saline extract (TS) antigen of cysticerci were employed. A total of 115 CSF samples from patients with neurocysticercosis were analysed; of these, 94 were reactive, corresponding to a sensitivity of 81.7%, with a 95% confidence limit (CL95%) ranging from 74.5% to 88.9%. Of 89 CSF samples from control-group individuals, 94.4% were nonreactive (CL95% 89.6% to 99.2%). The positive and negative predictive values were 1.4% and 99.9%, respectively, considering the mean rate of the disease in the population. These results indicate that this assay provides a rapid, highly reproducible, and moderately sensitive means of detecting specific antibodies in CSF samples.
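The predictive values quoted above follow from sensitivity, specificity, and disease prevalence via Bayes' rule; the sketch below shows the calculation. The prevalence value used here is assumed for illustration (the abstract does not state the value actually used), although with this choice the formula gives results broadly consistent with the quoted 1.4% and 99.9%.

```python
# Sketch of how predictive values follow from sensitivity, specificity, and
# disease prevalence (Bayes' rule). The prevalence below is an assumption.

def predictive_values(sensitivity, specificity, prevalence):
    ppv = (sensitivity * prevalence) / (
        sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
    npv = (specificity * (1 - prevalence)) / (
        specificity * (1 - prevalence) + (1 - sensitivity) * prevalence)
    return ppv, npv

# Sensitivity and specificity from the abstract; a prevalence of 0.1% is assumed.
print(predictive_values(0.817, 0.944, 0.001))
```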
Abstract:
Eighty-one cerebrospinal fluid (CSF) samples, mainly from cases of aseptic meningitis and motor deficiency syndrome, were sent to the Virology Section of the Evandro Chagas Institute, Belém, Pará, between January 1995 and January 1996 for virus isolation. All samples were inoculated onto HEp-2 cell cultures and into newborn mice, with negative results. The probability of isolating viruses by these methods is reduced because of the low concentration of viral particles in such specimens. To obtain more information about the etiology of these cases, a group of 23 samples was selected for testing by a technique more sensitive than virus isolation: the reverse transcription polymerase chain reaction (RT-PCR). Specific primers directed to conserved regions of the enterovirus genome were used, considering that this group of viruses is frequently associated with these neurological disorders. The patients' ages ranged from 1 to 55 years and nearly all of them lived in Belém, State of Pará, northern Brazil. Of the 15 samples analyzed by RT-PCR, nine (60%) were positive; of these, 6 (66.6%) were from patients with motor deficiency and 3 (33.3%) from patients who developed aseptic meningitis. These results show that it is important to investigate enteroviruses as a cause of these syndromes.
Abstract:
In recent years, vehicular cloud computing (VCC) has emerged as a new technology used in a wide range of multimedia-based healthcare applications. In VCC, vehicles act as intelligent machines that collect and transfer healthcare data to local or global sites for storage and computation, since vehicles have comparatively limited storage and computation power for handling multimedia files. However, due to dynamic changes in topology and the lack of centralized monitoring points, this information can be altered or misused. Such security breaches can result in disastrous consequences such as loss of life or financial fraud. To address these issues, a learning automata-assisted distributive intrusion detection system based on clustering is designed. Although the proposed scheme can be applied in a number of settings, a multimedia-based healthcare application is used here to illustrate it. In the proposed scheme, learning automata (LA) are assumed to be stationed on the vehicles; they take clustering decisions intelligently and select one member of the group as a cluster-head. The cluster-heads then assist in efficient storage and dissemination of information through a cloud-based infrastructure. To secure the proposed scheme against malicious activities, a standard cryptographic technique is used, in which the automaton learns from the environment and takes adaptive decisions to identify any malicious activity in the network. The stochastic environment in which an automaton performs its actions issues rewards and penalties, and the automaton updates its action probability vector after receiving each reinforcement signal. The proposed scheme was evaluated using extensive simulations in ns-2 with SUMO. The results indicate that the proposed scheme yields a 10% improvement in the detection rate of malicious nodes compared with existing schemes.
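As an illustration of the action-probability update mentioned above, the sketch below implements a standard linear reward-penalty learning automaton rule; the action set, learning rates, and environment are assumptions, not the paper's actual design.

```python
import numpy as np

# Minimal sketch of a learning automaton action-probability update of the kind
# described above (a standard linear reward-penalty scheme). Parameters and
# actions are illustrative assumptions.

def update(probs, chosen, reward, a=0.1, b=0.05):
    """Update the action probability vector after a reinforcement signal."""
    p = probs.copy()
    n = len(p)
    if reward:                      # reward: shift probability toward the chosen action
        p[chosen] += a * (1.0 - p[chosen])
        for j in range(n):
            if j != chosen:
                p[j] *= (1.0 - a)
    else:                           # penalty: spread probability to the other actions
        p[chosen] *= (1.0 - b)
        for j in range(n):
            if j != chosen:
                p[j] = b / (n - 1) + (1.0 - b) * p[j]
    return p

probs = np.array([0.5, 0.5])        # e.g. {flag node as malicious, do not flag}
probs = update(probs, chosen=0, reward=True)
print(probs)                        # probability mass shifts toward action 0
```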
Abstract:
IEEE 802.11 is one of the most well-established and widely used standards for wireless LANs. Its Medium Access Control (MAC) layer assumes that devices adhere to the standard's rules and timers to ensure fair access to and sharing of the medium. However, the flexibility and configurability of wireless card drivers make it possible for selfish misbehaving nodes to gain an advantage over well-behaving nodes. The existence of selfish nodes degrades the QoS of the other devices in the network and may increase their energy consumption. In this paper we propose a green solution for selfish misbehavior detection in IEEE 802.11-based wireless networks. The proposed scheme works in two phases: a Global phase, which detects whether the network contains selfish nodes, and a Local phase, which identifies which node or nodes within the network are selfish. The network must usually be examined frequently for selfish nodes during its operation, since any node may begin to act selfishly. Our solution is green in the sense that it conserves network resources: it avoids wasting nodes' energy by not examining every individual node for selfishness when this is unnecessary. The proposed detection algorithm is evaluated using extensive OPNET simulations. The results show that the Global network metric clearly indicates the existence of a selfish node, while the Local node metric successfully identifies the selfish node(s). We also provide a mathematical analysis of selfish misbehavior and derive formulas for the successful channel access probability.
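A back-of-the-envelope illustration of channel access probability is sketched below: with a fixed contention window W and no exponential backoff, a node's per-slot access probability is roughly 2/(W+1), so a selfish node that shrinks its window gains access more often. This is a simplified textbook model, not the formulas derived in the paper.

```python
# Illustrative model (not the paper's derivation): with a fixed contention
# window W and no exponential backoff, per-slot access probability is roughly
# tau = 2 / (W + 1). A selfish node shrinking its window raises its own tau.

def access_probability(cw):
    return 2.0 / (cw + 1.0)

def success_probability(tau_self, taus_others):
    """Probability the node transmits alone in a slot (no collision)."""
    p = tau_self
    for t in taus_others:
        p *= (1.0 - t)
    return p

honest = access_probability(32)          # standard-compliant contention window
selfish = access_probability(4)          # selfish node using a tiny window
print(honest, selfish)
print(success_probability(selfish, [honest] * 4))   # selfish node vs 4 honest nodes
```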