912 results for Numerical Algorithms and Problems
Abstract:
Recently, the telecommunication industry has benefited from infrastructure sharing, one of the most fundamental enablers of cloud computing, leading to the emergence of the Mobile Virtual Network Operator (MVNO) concept. The main goals of this approach are on-demand provisioning and elasticity of virtualized mobile network components, based on data traffic load. To realize this, during operation and management procedures, the virtualized services need to be triggered in order to scale up/down or scale out/in an instance. In this paper we propose an architecture called MOBaaS (Mobility and Bandwidth Availability Prediction as a Service), comprising two algorithms that predict user mobility and network link bandwidth availability; it can be implemented in a cloud-based mobile network structure and used as a support service by any other virtualized mobile network service. MOBaaS can provide prediction information to generate the triggers required for on-demand deployment, provisioning, and disposal of virtualized network components. This information can also be used for self-adaptation procedures and optimal network function configuration during run-time operation. Through preliminary experiments with a prototype implementation on the OpenStack platform, we evaluated and confirmed the feasibility and effectiveness of the prediction algorithms and the proposed architecture.
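The abstract describes prediction-driven triggers that scale virtualized components up or down based on traffic load. As a minimal illustrative sketch (not the MOBaaS algorithm itself — the thresholds and function name are assumptions), a simple rule mapping predicted utilization to scaling actions might look like:

```python
# Hypothetical sketch: turn a predicted link/traffic load into a scaling
# decision. Thresholds (80% / 30%) are illustrative assumptions, not values
# from the paper.

def scaling_trigger(predicted_load, capacity, high=0.8, low=0.3):
    """Return a scaling action based on predicted utilization."""
    utilization = predicted_load / capacity
    if utilization > high:
        return "scale-out"   # deploy an additional instance
    if utilization < low:
        return "scale-in"    # dispose of a surplus instance
    return "hold"            # predicted load fits current capacity
```

In a MOBaaS-style architecture, such a decision would be emitted as a trigger to the orchestration layer rather than acted on directly.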
Abstract:
BACKGROUND Implant-overdentures supported by rigid bars provide stability in the edentulous atrophic mandible. However, fractures of solder joints and matrices, and loosening of screws and matrices, were observed with soldered gold bars (G-bars). Computer-aided designed/computer-assisted manufactured (CAD/CAM) titanium bars (Ti-bars) may reduce technical complications due to enhanced material quality. PURPOSE To compare prosthetic-technical maintenance service of mandibular implant-overdentures supported by CAD/CAM Ti-bars and soldered G-bars. MATERIALS AND METHODS Edentulous patients were consecutively admitted for implant-prosthodontic treatment with a maxillary complete denture and a mandibular implant-overdenture connected to a rigid G-bar or Ti-bar. Maintenance service and problems with the implant-retention device complex and the prosthesis were recorded for a minimum of 3-4 years. Annual peri-implant crestal bone level changes (ΔBIC) were radiographically assessed. RESULTS Data of 213 edentulous patients (mean age 68 ± 10 years), who had received a total of 477 tapered implants, were available. The Ti-bar and G-bar groups comprised 101 and 112 patients with 231 and 246 implants, respectively. Ti-bars mostly exhibited distal bar extensions (96%), compared to 34% of G-bars (p < .001). The fracture rate of bar extensions (4.7% vs 14.8%, p < .001) and matrices (1% vs 13%, p < .001) was lower for Ti-bars. Matrix activation was required 2.4 times less often in the Ti-bar group. ΔBIC remained stable for both groups. CONCLUSIONS Implant overdentures supported by soldered gold bars or milled CAD/CAM Ti-bars are a successful treatment modality but require regular maintenance service. These short-term observations support the hypothesis that CAD/CAM Ti-bars reduce technical complications. Fracture locations indicated that the titanium thickness around the screw-access hole should be increased.
Abstract:
BACKGROUND: HIV surveillance requires monitoring of new HIV diagnoses and differentiation of incident and older infections. In 2008, Switzerland implemented a system for monitoring incident HIV infections based on the results of a line immunoassay (Inno-Lia) mandatorily conducted for HIV confirmation and type differentiation (HIV-1, HIV-2) of all newly diagnosed patients. Based on this system, we assessed the proportion of incident HIV infection among newly diagnosed cases in Switzerland during 2008-2013. METHODS AND RESULTS: Inno-Lia antibody reaction patterns recorded in anonymous HIV notifications to the federal health authority were classified by 10 published algorithms into incident (up to 12 months) or older infections. Utilizing these data, annual incident infection estimates were obtained in two ways, (i) based on the diagnostic performance of the algorithms and utilizing the relationship 'incident = true incident + false incident', (ii) based on the window-periods of the algorithms and utilizing the relationship 'Prevalence = Incidence x Duration'. From 2008-2013, 3'851 HIV notifications were received. Adult HIV-1 infections amounted to 3'809 cases, and 3'636 of them (95.5%) contained Inno-Lia data. Incident infection totals calculated were similar for the performance- and window-based methods, amounting on average to 1'755 (95% confidence interval, 1588-1923) and 1'790 cases (95% CI, 1679-1900), respectively. More than half of these were among men who had sex with men. Both methods showed a continuous decline of annual incident infections 2008-2013, totaling -59.5% and -50.2%, respectively. The decline of incident infections continued even in 2012, when a 15% increase in HIV notifications had been observed. This increase was entirely due to older infections. Overall declines 2008-2013 were of similar extent among the major transmission groups. CONCLUSIONS: Inno-Lia based incident HIV-1 infection surveillance proved useful and reliable. 
It represents a free, additional public health benefit of the use of this relatively costly test for HIV confirmation and type differentiation.
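The two estimation relationships named in the abstract — 'incident = true incident + false incident' and 'Prevalence = Incidence x Duration' — can be made concrete with a small numeric sketch. All figures, function names, and parameter values below are illustrative assumptions, not the Swiss surveillance data or the published algorithms:

```python
# Sketch of the two incidence-estimation routes described in the abstract.

def window_based_incidence(n_classified_incident, window_months=12.0,
                           mean_window_months=6.0):
    # Prevalence = Incidence x Duration  =>  Incidence = Prevalence / Duration.
    # Cases classified "incident" are a prevalence over the algorithm's mean
    # window; rescale to the 12-month reporting period.
    return n_classified_incident * (window_months / mean_window_months)

def performance_based_incidence(n_classified_incident, n_total,
                                sensitivity, specificity):
    # incident = true incident + false incident.  With
    #   classified = sens * true + (1 - spec) * (total - true),
    # invert for the true incident count (Rogan-Gladen-style correction).
    false_pos_rate = 1.0 - specificity
    return ((n_classified_incident - false_pos_rate * n_total)
            / (sensitivity - false_pos_rate))
```

With made-up inputs (150 classified incident, a 6-month mean window) the window method annualizes to 300 cases; the performance method recovers the same true count from a classification with known sensitivity and specificity.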
Abstract:
As a complement to experimental and theoretical approaches, numerical modeling has become an important component of the study of asteroid collisions and impact processes. In the last decade, there have been significant advances in both computational resources and numerical methods. We discuss the present state-of-the-art numerical methods and material models used in "shock physics codes" to simulate impacts and collisions and give some examples of those codes. Finally, recent modeling studies are presented, focusing on the effects of various material properties and target structures on the outcome of a collision.
Abstract:
Information-centric networking (ICN) is a new communication paradigm that has been proposed to cope with drawbacks of host-based communication protocols, namely scalability and security. In this thesis, we base our work on Named Data Networking (NDN), which is a popular ICN architecture, and investigate NDN in the context of wireless and mobile ad hoc networks. In a first part, we focus on NDN efficiency (and potential improvements) in wireless environments by investigating NDN in wireless one-hop communication, i.e., without any routing protocols. A basic requirement to initiate information-centric communication is knowledge of existing and available content names. Therefore, we develop three opportunistic content discovery algorithms and evaluate them in diverse scenarios for different node densities and content distributions. After content names are known, requesters can retrieve content opportunistically from any neighbor node that provides it. However, in case of short contact times to content sources, content retrieval may be disrupted. Therefore, we develop a requester application that keeps meta information of disrupted content retrievals and enables resume operations when a new content source has been found. Besides message efficiency, we also evaluate the power consumption of information-centric broadcast and unicast communication. Based on our findings, we develop two mechanisms to increase the efficiency of information-centric wireless one-hop communication. The first approach, called Dynamic Unicast (DU), avoids broadcast communication whenever possible, since broadcast transmissions result in more duplicate Data transmissions, lower data rates, and, compared to unicast communication, higher energy consumption on mobile nodes that are not interested in overheard Data. Hence, DU uses broadcast communication only until a content source has been found and then retrieves content directly via unicast from the same source.
The second approach, called RC-NDN, targets the efficiency of wireless broadcast communication by reducing the number of duplicate Data transmissions. In particular, RC-NDN is a Data encoding scheme for content sources that increases diversity in wireless broadcast transmissions such that multiple concurrent requesters can profit from each other's (overheard) message transmissions. If requesters and content sources are not within one-hop distance of each other, requests need to be forwarded via multi-hop routing. Therefore, in a second part of this thesis, we investigate information-centric wireless multi-hop communication. First, we consider multi-hop broadcast communication in the context of rather static community networks. We introduce the concept of preferred forwarders, which relay Interest messages slightly faster than non-preferred forwarders to reduce redundant duplicate message transmissions. While this approach works well in static networks, the performance may degrade in mobile networks if preferred forwarders regularly move away. Thus, to enable routing in mobile ad hoc networks, we extend DU for multi-hop communication. Compared to one-hop communication, multi-hop DU requires efficient path update mechanisms (since multi-hop paths may expire quickly) and new forwarding strategies to maintain NDN benefits (request aggregation and caching) such that only a few messages need to be transmitted over the entire end-to-end path even in case of multiple concurrent requesters. To perform quick retransmissions in case of collisions or other transmission errors, we implement and evaluate retransmission timers from related work and compare them to CCNTimer, a new algorithm that enables shorter content retrieval times in information-centric wireless multi-hop communication. Yet, in case of intermittent connectivity between requesters and content sources, multi-hop routing protocols may not work because they require continuous end-to-end paths.
Therefore, we present agent-based content retrieval (ACR) for delay-tolerant networks. In ACR, requester nodes can delegate content retrieval to mobile agent nodes, which move closer to content sources, can retrieve content and return it to requesters. Thus, ACR exploits the mobility of agent nodes to retrieve content from remote locations. To enable delay-tolerant communication via agents, retrieved content needs to be stored persistently such that requesters can verify its authenticity via original publisher signatures. To achieve this, we develop a persistent caching concept that maintains received popular content in repositories and deletes unpopular content if free space is required. Since our persistent caching concept can complement regular short-term caching in the content store, it can also be used for network caching to store popular delay-tolerant content at edge routers (to reduce network traffic and improve network performance) while real-time traffic can still be maintained and served from the content store.
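The Dynamic Unicast idea described in this abstract — broadcast Interests only until a content source answers, then switch to unicast toward that source, falling back on timeout — can be sketched as a small state machine. The class and method names below are illustrative assumptions, not the thesis implementation:

```python
# Simplified sketch of the Dynamic Unicast (DU) mode switch from the abstract.

class DynamicUnicast:
    def __init__(self):
        self.source = None  # identifier of the last responding content source

    def next_interest_mode(self):
        """Broadcast while no source is known, unicast once one is."""
        return ("unicast", self.source) if self.source else ("broadcast", None)

    def on_data(self, from_node):
        self.source = from_node  # remember the responding source

    def on_timeout(self):
        self.source = None       # source lost: fall back to broadcast discovery
```

The key property is that broadcast is used only for discovery; as soon as any neighbor delivers Data, subsequent Interests go out as cheaper unicast frames to that neighbor.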
Abstract:
Historically, morphological features were used as the primary means to classify organisms. However, the age of molecular genetics has allowed us to approach this field from the perspective of the organism's genetic code. Early work used highly conserved sequences, such as ribosomal RNA. The increasing number of complete genomes in the public data repositories provides the opportunity to look not only at a single gene, but at organisms' entire parts lists. Here the Sequence Comparison Index (SCI) and the Organism Comparison Index (OCI), algorithms and methods to compare proteins and proteomes, are presented. The complete proteomes of 104 sequenced organisms were compared. Over 280 million full Smith-Waterman alignments were performed on sequence pairs which had a reasonable expectation of being related. From these alignments a whole proteome phylogenetic tree was constructed. This method was also used to compare the small subunit (SSU) rRNA from each organism, and a tree was constructed from these results. The SSU rRNA tree by the SCI/OCI method looks very much like accepted SSU rRNA trees from sources such as the Ribosomal Database Project, thus validating the method. The SCI/OCI proteome tree showed a number of small but significant differences when compared to the SSU rRNA tree and to proteome trees constructed by other methods. Horizontal gene transfer does not appear to affect the SCI/OCI trees until the transferred genes make up a large portion of the proteome. As part of this work, the Database of Related Local Alignments (DaRLA) was created; it contains over 81 million rows of sequence alignment information. DaRLA, while primarily used to build the whole proteome trees, can also be applied to shared gene content analysis, gene order analysis, and creating individual protein trees. Finally, the standard BLAST method for analyzing shared gene content was compared to the SCI method using 4 spirochetes.
The SCI system performed flawlessly, finding all proteins from one organism against itself and finding all the ribosomal proteins between organisms. The BLAST system missed some proteins from its respective organism and failed to detect small ribosomal proteins between organisms.
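The alignment-and-normalize approach described above can be illustrated with a toy sketch: score protein pairs by Smith-Waterman local alignment and derive a normalized similarity. The scoring parameters and the normalization rule below are assumptions chosen for illustration — they are not the published SCI formula:

```python
# Toy Smith-Waterman scorer (simple match/mismatch/gap costs instead of a
# substitution matrix) plus an illustrative similarity normalization.

def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Return the best local alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

def similarity(a, b):
    # Normalize by the self-alignment score of the shorter sequence, so an
    # identical pair scores 1.0.  (Illustrative choice, not the SCI.)
    self_score = smith_waterman(a, a) if len(a) <= len(b) else smith_waterman(b, b)
    return smith_waterman(a, b) / self_score
```

A proteome-scale comparison would run such pairwise scores over all related sequence pairs and aggregate them into an organism-level distance before tree building.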
Abstract:
Considerable evidence suggests that central cholinergic neurons participate in the acquisition, storage, or retrieval of information. Experiments were designed to evaluate information processing in mice following either reversible or irreversible impairment of central cholinergic activity. The cholinergic receptor antagonists atropine and methylatropine were used to reversibly inhibit cholinergic transmission. Irreversible impairment of central cholinergic function was achieved by central administration of the cholinergic-specific neurotoxins N-ethyl-choline aziridinium (ECA) and N-ethyl-acetylcholine aziridinium (EACA). ECA and EACA appear to act by irreversible inhibition of high-affinity choline uptake (the proposed rate-limiting step in acetylcholine synthesis). Intraventricular administration of ECA or EACA produced a persistent reduction in hippocampal choline acetyltransferase activity. Other neuronal systems and brain regions showed no evidence of toxicity. Mice treated with either ECA or EACA showed behavioral deficits associated with cholinergic dysfunction. Passive avoidance behavior was significantly impaired by cholinotoxin treatment. Radial arm maze performance was also significantly impaired in cholinotoxin-treated animals. Deficits in radial arm maze performance were transient, however, such that rapid and apparently complete behavioral recovery was seen during retention testing. The centrally active cholinergic receptor antagonist atropine also caused significant impairment in radial arm maze behavior, while equivalent doses of methylatropine were without effect. The relative effects of cholinotoxin and receptor antagonist treatment on short-term (working) memory and long-term (reference) memory in radial arm maze behavior were examined. Maze rotation studies indicated that there were at least two different response strategies which could result in accurate maze performance.
One strategy involved the use of response algorithms and was considered to be a function of reference memory. Another strategy appeared to be primarily dependent on spatial working memory. However, all behavioral paradigms with multiple trials have reference memory requirements (i.e., information useful over all trials). Performance was similarly affected following either cholinotoxin or anticholinergic treatment, regardless of the response strategy utilized. In addition, rates of behavioral recovery following cholinotoxin treatment were similar between response groups. It was concluded that both cholinotoxin and anticholinergic treatment primarily resulted in impaired reference memory processes.
Abstract:
The effectiveness of the Anisotropic Analytical Algorithm (AAA) implemented in the Eclipse treatment planning system (TPS) was evaluated using the Radiological Physics Center anthropomorphic lung phantom with both flattened and flattening-filter-free high-energy beams. Radiation treatment plans were developed following the Radiation Therapy Oncology Group and Radiological Physics Center guidelines for lung treatment using Stereotactic Body Radiation Therapy (SBRT). The tumor was covered such that at least 95% of the Planning Target Volume (PTV) received 100% of the prescribed dose while ensuring that normal tissue constraints were followed as well. Calculated doses were exported from the Eclipse TPS and compared with the experimental data as measured using thermoluminescence detectors (TLD) and radiochromic films that were placed inside the phantom. The results demonstrate that the AAA superposition-convolution algorithm is able to calculate SBRT treatment plans with all clinically used photon beams in the range from 6 MV to 18 MV. The measured dose distribution showed good agreement with the calculated distribution using clinically acceptable criteria of ±5% dose or 3 mm distance to agreement. These results show that in a heterogeneous environment a 3D pencil beam superposition-convolution algorithm with Monte Carlo pre-calculated scatter kernels, such as AAA, is able to reliably calculate dose, accounting for increased lateral scattering due to the loss of electronic equilibrium in low-density medium. The data for high-energy plans (15 MV and 18 MV) showed very good tumor coverage, in contrast to findings by other investigators for less sophisticated dose calculation algorithms, which demonstrated lower than expected tumor doses and generally worse tumor coverage for high-energy plans compared to 6 MV plans.
This demonstrates that the modern superposition-convolution AAA algorithm is a significant improvement over previous algorithms and is able to calculate doses accurately for SBRT treatment plans in the highly heterogeneous environment of the thorax for both lower (≤12 MV) and higher (greater than 12 MV) beam energies.
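The composite acceptance criterion quoted above (±5% dose difference or 3 mm distance-to-agreement) can be sketched for a 1D dose profile. This is a simplified discrete approximation with illustrative function and parameter names — a clinical gamma/DTA implementation would interpolate the calculated dose between grid points:

```python
# Simplified 1D sketch of a "+/-5% dose OR 3 mm DTA" point check.

def point_passes(x_mm, measured, positions_mm, calc_doses,
                 dose_tol=0.05, dta_mm=3.0):
    """Return True if a measured dose point agrees with the calculated profile."""
    # Dose-difference test against the calculated dose at the same position.
    calc_here = calc_doses[positions_mm.index(x_mm)]
    if abs(measured - calc_here) <= dose_tol * calc_here:
        return True
    # DTA test (discrete approximation): does the calculated profile reach
    # the measured dose level anywhere within the 3 mm search radius?
    nearby = [d for x, d in zip(positions_mm, calc_doses)
              if abs(x - x_mm) <= dta_mm]
    return min(nearby) <= measured <= max(nearby)
```

In a steep dose gradient (e.g. the SBRT field edge), the DTA branch is what rescues points that fail the raw dose-difference test, which is exactly why the criterion is stated as a disjunction.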
Abstract:
We have measured the carbon isotopic composition of dissolved inorganic carbon in bottom waters of the Ontong Java Plateau (western equatorial Pacific) and on the northern Emperor Seamounts (northwest Pacific). Each of these locations is several hundred miles from the nearest Geochemical Ocean Sections Study (GEOSECS) stations, and the observed delta13C values at each site differ substantially from regionally averaged GEOSECS delta13C profiles. We discuss the possible causes of these differences, including horizontal variability, near-bottom effects, and problems with the Pacific GEOSECS delta13C data. We also measured the isotopic composition (C and O) of core top C. wuellerstorfi from a depth transect of cores at each location. The delta18O data are used to verify that our samples are Holocene. Comparison of foraminiferal and bottom water delta13C values shows that this species faithfully records bottom water delta13C at both sites and demonstrates that there is no depth-related artifact in the dissolved inorganic carbon-C. wuellerstorfi delta13C relationship at these sites.
Abstract:
Heavy (magnetic and non-magnetic) minerals are found concentrated by natural processes in many fluvial, estuarine, coastal, and shelf environments, with the potential to form economic placer deposits. Understanding the processes of heavy mineral transport and enrichment is a prerequisite for interpreting sediment magnetic properties in terms of hydro- and sediment dynamics. In this study, we combine rock magnetic and sedimentological laboratory measurements with numerical 3D discrete element models to investigate differential grain entrainment and transport rates of magnetic minerals in a range of coastal environments (riverbed, mouth, estuary, beach, and near-shore). We analyzed grain-size distributions of representative bulk samples and their magnetic mineral fractions to relate grain-size modes to respective transport modes (traction, saltation, suspension). Rock magnetic measurements showed that the distribution shapes, population sizes, and grain-size offsets of bulk and magnetic mineral fractions hold information on the transport conditions and enrichment process in each depositional environment. A downstream decrease in magnetite grain size and an increase in magnetite concentration were observed from riverine source to marine sink environments. Lower flow velocities permit differential settling of light and heavy mineral grains, creating heavy mineral enriched zones in estuary settings, while lighter minerals are washed out further into the sea. Numerical model results showed that higher heavy mineral concentrations in the bed increased the erosion rate, enhancing heavy mineral enrichment. In beach environments, where sediments contained light and heavy mineral grains of equivalent grain sizes, the bed was found to be more stable, with a negligible amount of erosion compared to other bed compositions. Heavy mineral transport rates calculated for four different bed compositions showed that increasing heavy mineral content in the bed decreased the transport rate.
There is always a lag in transport between light and heavy minerals, which increases with higher heavy mineral concentration in all tested bed compositions. The results of the laboratory experiments were validated by the numerical models and showed good agreement. We demonstrate that the presented approach bears the potential to investigate heavy mineral enrichment processes in a wide range of sedimentary settings.
Abstract:
New data are reported on the structure of sections, chemical composition, and age of volcano-sedimentary and volcanic rocks from the Sinii Utes Depression in the Southern Primorye region. The Sinii Utes Depression is filled with two sequences: the lower sequence, composed of sedimentary-volcanogenic coaliferous rocks (the stratotype of the Sinii Utes Formation), and the upper sequence, consisting of tephroids with overlying basalts. This work considers the chemical composition and problems of K-Ar dating of the basalts. The uppermost basaltic flow has a K-Ar age of 22.0 ± 1.0 Ma. The dates obtained for the middle and upper parts of the lava flows are underestimated. This is explained by their heating due to combustion of brown coals of the Sinii Utes Formation underlying the lava flow. Calculations show that argon could only partly have been removed from the basalts owing to conductive heat transfer and was lost largely due to infiltration of hot gases in a heterogeneous fissured medium. Basaltic volcanism on the continental margins of the southern Primorye region and the adjacent Korean and Chinese areas at the Oligocene-Miocene boundary preceded Early-Middle Miocene spreading and the formation of the Sea of Japan basin. Undifferentiated moderately alkaline basalts of intraplate affinity developed in the Amba Depression and some other structures of the southern Primorye region, and intraplate alkali basalts of the Phohang Graben in the Korean Peninsula, serve as indicators of an incipient spreading regime in the Sea of Japan. Potassic basalt-trachybasalt eruptions occurred locally in riftogenic depressions and shield volcanoes. In some structures this volcanism was terminated by eruptions of intermediate and acid lavas. Such an evolution of volcanism is explained by selective contamination of basaltic melts during their interaction with crustal acid material and the generation of acid anatectic melts.
Abstract:
Most current methods of reconstructing past sea levels within Antarctica rely on radiocarbon dating. However, radiocarbon dating is limited by the availability of material for dating and by problems inherent in radiocarbon reservoirs in Antarctic marine systems. Here we report on the success of a new approach to dating raised beach deposits in Antarctica for the purpose of reconstructing past sea levels. This new approach is the use of optically stimulated luminescence (OSL) on quartz grains obtained from the undersides of cobbles within raised beaches and boulder pavements. We obtained eight OSL dates from three sites along the shores of Maxwell Bay in the South Shetland Islands of the Antarctic Peninsula. These dates are internally consistent and fit well with previously published radiocarbon ages obtained from the same deposits. In addition, when the technique was applied to a modern beach, it resulted in an age of zero. Our results suggest that this method will provide a valuable tool for the reconstruction of past sea levels in Antarctica and in other coarse-grained beach deposits across the globe.
Abstract:
Monitoring the impact of sea storms on coastal areas is fundamental to studying beach evolution and the vulnerability of low-lying coasts to erosion and flooding. Modelling wave runup on a beach is possible, but it requires accurate topographic data and model tuning, which can be done by comparing observed and modeled runup. In this study we collected aerial photos using an Unmanned Aerial Vehicle (UAV) after two different swells in the same study area. We merged the point cloud obtained with photogrammetry with multibeam data in order to obtain a complete beach topography. Then, on each set of rectified and georeferenced UAV orthophotos, we identified the maximum wave runup for both events by recognizing the wet area left by the waves. We then used our topography and numerical models to simulate the wave runup and compared the model results to the values observed during the two events. Our results highlight the potential of the presented methodology, which integrates UAV platforms, photogrammetry, and Geographic Information Systems to provide faster and cheaper information on beach topography and geomorphology compared with traditional techniques, without losing accuracy. We use the results obtained from this technique as a topographic base for a model that calculates runup for the two swells. The observed and modeled runups are consistent and open new directions for future research.
Abstract:
The shelf seas of the Arctic are known for their large sea-ice production. This paper presents a comprehensive view of the Kara Sea sea-ice cover from high-resolution numerical modeling and space-borne microwave radiometry. As given by the latter, the average polynya area in the Kara Sea takes a value of 21.2 × 10³ km² ± 9.1 × 10³ km² for the winters (Jan.-Apr.) 1996/97 to 2000/01, being as high as 32.0 × 10³ km² in 1999/2000 and below 12 × 10³ km² in 1998/99. Day-to-day variations of the Kara Sea polynya area can be as high as 50 × 10³ km². For the seasons 1996/97 to 2000/01 the modeled cumulative winter ice-volume flux out of the Kara Sea varied between 100 km³/a and 350 km³/a. Modeled high (low) ice export coincides with a high (low) average and cumulative polynya area, with a low (high) sea-ice compactness in the Kara Sea from remote sensing data, and with a high (low) sea-ice drift speed across its northern boundary derived from independent model data for the winters 1996/97 to 2000/01.
Abstract:
The CoastColour Round Robin (CCRR) project (http://www.coastcolour.org), funded by the European Space Agency (ESA), was designed to bring together a variety of reference datasets and to use these to test algorithms and assess their accuracy for retrieving water quality parameters. This information was then developed to help end-users of remote sensing products select the most accurate algorithms for their coastal region. To facilitate this, an inter-comparison of the performance of algorithms for the retrieval of in-water properties over coastal waters was carried out. The comparison used three types of datasets on which ocean colour algorithms were tested. The description and comparison of the three datasets are the focus of this paper; they include the Medium Resolution Imaging Spectrometer (MERIS) Level 2 match-ups, in situ reflectance measurements, and data generated by a radiative transfer model (HydroLight). The datasets mainly consisted of 6,484 marine reflectances associated with various geometrical (sensor viewing and solar angles) and sky conditions and water constituents: Total Suspended Matter (TSM) and Chlorophyll-a (CHL) concentrations, and the absorption of Coloured Dissolved Organic Matter (CDOM). Inherent optical properties were also provided in the simulated datasets (5,000 simulations) and from 3,054 match-up locations. The distributions of reflectance at selected MERIS bands and band ratios, and of CHL and TSM as a function of reflectance, from the three datasets are compared. Match-up and in situ sites where deviations occur are identified. The distributions of the three reflectance datasets are also compared to the simulated and in situ reflectances used previously by the International Ocean Colour Coordinating Group (IOCCG, 2006) for algorithm testing, showing a clear extension of the CCRR data, which covers more turbid waters.