903 results for Advanced application and branching systems
Abstract:
We examined outcomes and trends in surgery and radiation use for patients with locally advanced esophageal cancer, for whom optimal treatment is not clear. Trends in surgery and radiation for patients with T1-T3N1M0 squamous cell or adenocarcinoma of the mid or distal esophagus in the Surveillance, Epidemiology, and End Results database from 1998 to 2008 were analyzed using generalized linear models with year as a predictor; the Surveillance, Epidemiology, and End Results database does not record chemotherapy data. Local treatment was considered unimodal if patients had only surgery or radiation and bimodal if they had both. Five-year cancer-specific survival (CSS) and overall survival (OS) were analyzed using propensity-score-adjusted Cox proportional-hazards models. Overall 5-year survival for the 3295 patients identified (mean age 65.1 years, standard deviation 11.0) was 18.9% (95% confidence interval: 17.3-20.7). Local treatment was bimodal for 1274 (38.7%) and unimodal for 2021 (61.3%) patients; 1325 (40.2%) had radiation alone and 696 (21.1%) underwent only surgery. The use of bimodal therapy (32.8-42.5%, P = 0.01) and radiation alone (29.3-44.5%, P < 0.001) increased significantly from 1998 to 2008. Bimodal therapy predicted improved CSS (hazard ratio [HR]: 0.68, P < 0.001) and OS (HR: 0.58, P < 0.001) compared with unimodal therapy. For the first 7 months (before the survival curves crossed), CSS after radiation therapy alone was similar to surgery alone (HR: 0.86, P = 0.12), while OS was worse for surgery only (HR: 0.70, P = 0.001). However, worse CSS (HR: 1.43, P < 0.001) and OS (HR: 1.46, P < 0.001) after that initial timeframe were found for radiation therapy only. The use of radiation to treat locally advanced mid and distal esophageal cancers increased from 1998 to 2008. Survival was best when both surgery and radiation were used.
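The 5-year CSS and OS figures reported above are the kind of quantities read off Kaplan-Meier curves before fitting Cox models. As a minimal, self-contained illustration (pure Python, synthetic data, not the SEER cohort), the product-limit estimator can be sketched as:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times  : follow-up time for each patient
    events : 1 if death observed, 0 if censored
    Returns a list of (time, survival_probability) at each event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, e in data if tt == t)
        if deaths > 0:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
        i += removed
    return curve
```

On real registry data one would use a survival-analysis library and add the propensity-score adjustment described above; this sketch only shows the estimator itself.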
Abstract:
Background and aims Differences in chemical composition of root compounds and root systems among tree species may affect organic matter (OM) distribution, source and composition in forest soils. The objective of this study was to elucidate the contribution of species-specific cutin and suberin biomarkers as proxies for shoot- and root-derived organic carbon (OC) to soil OM at different depths with increasing distance to the stems of four different tree species. Methods The contribution of cutin- and suberin-derived lipids to OM in a Cutanic Alisol was analyzed with increasing soil depth and distance to the stems of Fagus sylvatica L., Picea abies (L.) Karst., Quercus robur L. and Pseudotsuga menziesii (Mirb.) Franco. Cutin and suberin monomers of plants and soils were analyzed by alkaline hydrolysis and subsequent gas chromatography–mass spectrometry. Results The amount and distribution of suberin-derived lipids in soil clearly reflected the specific root system of the different tree species. The amount of cutin-derived lipids decreased strongly with soil depth, indicating that the input of leaf/needle material is restricted to the topsoil. In contrast to the suberin-derived lipids, the spatial pattern of cutin monomer contribution to soil OM did not depend on tree species. Conclusions Our results document the importance of tree species as a main factor controlling the composition and distribution of OM in forest soils. They reveal the impact of tree species on root-derived OM distribution and the necessity to distinguish among different zones when studying soil OM storage in forests.
Abstract:
Digital Rights Management Systems (DRMS) are seen by content providers as the appropriate tool to, on the one hand, fight piracy and, on the other hand, monetize their assets. Although these systems claim to be very powerful and include multiple protection technologies, there is a lack of understanding about how such systems are currently being implemented and used by content providers. The aim of this paper is twofold. First, it provides a theoretical basis through which we briefly present the seven core protection technologies of a DRMS. Second, it provides empirical evidence that the seven protection technologies outlined in the first section are the most commonly used technologies. It further evaluates to what extent these technologies are being used within the music and print industries. It concludes that the three main technologies are encryption, password, and payment systems. However, there are some industry differences: in the number of protection technologies used, the requirements for a DRMS, the required investment, and the perceived success of DRMS in fighting piracy.
Abstract:
In this work, electrophoretic preconcentration of protein and peptide samples in microchannels was studied theoretically using the 1D dynamic simulator GENTRANS, and experimentally combined with MS. In all configurations studied, the sample was uniformly distributed throughout the channel before power application, and driving electrodes were used as microchannel ends. In the first part, previously obtained experimental results from carrier-free systems are compared to simulation results, and the effects of atmospheric carbon dioxide and impurities in the sample solution are examined. Simulation provided insight into the dynamics of the transport of all components under the applied electric field and revealed the formation of a pure water zone in the channel center. In the second part, the use of an IEF procedure with simple well defined amphoteric carrier components, i.e. amino acids, for concentration and fractionation of peptides was investigated. By performing simulations a qualitative description of the analyte behavior in this system was obtained. Neurotensin and [Glu1]-Fibrinopeptide B were separated by IEF in microchannels featuring a liquid lid for simple sample handling and placement of the driving electrodes. Component distributions in the channel were detected using MALDI- and nano-ESI-MS and data were in agreement with those obtained by simulation. Dynamic simulations are demonstrated to represent an effective tool to investigate the electrophoretic behavior of all components in the microchannel.
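GENTRANS couples electromigration with acid-base equilibria and electroneutrality, but the core of any such 1D dynamic simulator is the transport step. A deliberately stripped-down sketch (constant mobility and field, no chemistry, hypothetical grid parameters, not GENTRANS itself) of upwind advection of one component along the channel:

```python
def advect(conc, velocity, dx, dt, steps):
    """Explicit upwind advection of a 1-D concentration profile.

    conc     : list of concentrations per grid cell
    velocity : constant migration velocity (mobility * field), > 0
    """
    c = list(conc)
    courant = velocity * dt / dx     # must be <= 1 for stability
    assert 0 < courant <= 1
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c)):
            new[i] = c[i] - courant * (c[i] - c[i - 1])
        new[0] = c[0] - courant * c[0]   # open left boundary, no inflow
        c = new
    return c
```

With a Courant number of exactly 1 the profile shifts one cell per step, a convenient sanity check; a full simulator would advance many coupled components and recompute the local field each step.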
Abstract:
High-resolution, multichannel seismic data collected across the Great Bahama Bank margin and the adjacent Straits of Florida indicate that the deposition of Neogene-Quaternary strata in this transect is controlled by two sedimentation mechanisms: (1) west-dipping layers of the platform margin, which are a product of sea-level-controlled, platform-derived downslope sedimentation; and (2) east- or north-dipping drift deposits in the basinal areas, which are deposited by ocean currents. These two sediment systems are active simultaneously and interfinger at the toe-of-slope. The prograding system consists of sigmoidal clinoforms that advanced the margin some 25 km into the Straits of Florida. The foresets of the clinoforms are approximately 600 m high with variable slope angles that steepen significantly in the Pleistocene section. The seismic facies of the prograding clinoforms on the slope is characterized by dominant, partly chaotic, cut-and-fill geometries caused by submarine canyons that are oriented downslope. In the basin axis, seismic geometries and facies document deposition from and by currents. Most impressive is an 800-m-thick drift deposit at the confluence of the Santaren Channel and the Straits of Florida. This "Santaren Drift" is slightly asymmetric, thinning to the north. The drift displays a highly coherent seismic facies characterized by a continuous succession of reflections, indicating very regular sedimentation. Leg 166 of the Ocean Drilling Program (ODP) drilled a transect of five deep holes between 2 and 30 km from the modern platform margin and retrieved the sediments from both the slope and basin systems. The Neogene slope sediments consist of peri-platform oozes intercalated with turbidites, whereas the basinal drift deposits consist of more homogeneous, fine-grained carbonates that were deposited without major hiatuses by the Florida Current starting at approximately 12.4 Ma.
Sea-level fluctuations, which controlled the carbonate production on Great Bahama Bank by repeated exposure of the platform top, controlled lithologic alternations and hiatuses in sedimentation across the transect. Both sedimentary systems are contained in 17 seismic sequences that were identified in the Neogene-Quaternary section. Seismic sequence boundaries were identified based on geometric unconformities beneath the Great Bahama Bank. All the sequence boundaries could be traced across the entire transect into the Straits of Florida. Biostratigraphic age determinations of seismic reflections indicate that the seismic reflections of sequence boundaries have chronostratigraphic significance across both depositional environments.
Abstract:
Training and assessment paradigms for laparoscopic surgical skills are evolving from traditional mentor–trainee tutorship towards structured, more objective and safer programs. Accreditation of surgeons requires reaching a consensus on the metrics and tasks used to assess surgeons' psychomotor skills. Ongoing development of tracking systems and software solutions has allowed for the expansion of novel training and assessment means in laparoscopy. The current challenge is to adapt and include these systems within training programs, and to exploit their possibilities for evaluation purposes. This paper describes the state of the art in research on measuring and assessing psychomotor laparoscopic skills. It gives an overview of tracking systems as well as of the metrics and advanced statistical and machine learning techniques employed for evaluation purposes. The latter have the potential to be used as an aid in deciding on the surgical competence level, which is an important aspect when accreditation of surgeons in particular, and patient safety in general, are considered. These methods and tools show promise as complementary means for the assessment of surgical motor skills, especially in the early stages of training. Successful examples such as the Fundamentals of Laparoscopic Surgery should help drive a paradigm change towards structured curricula based on objective parameters. These may improve the accreditation of new surgeons, as well as optimize their already overloaded training schedules.
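Among the metrics commonly computed from such instrument-tracking data are path length and economy of motion. A minimal sketch (hypothetical function names; real assessment systems add time normalization, motion smoothness, and many further metrics):

```python
import math

def path_length(points):
    """Total distance travelled by a tracked instrument tip.

    points: list of (x, y, z) position samples."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def economy_of_motion(points):
    """Ratio of straight-line distance to actual path length (1.0 = ideal)."""
    total = path_length(points)
    return math.dist(points[0], points[-1]) / total if total else 1.0
```

Such per-task metrics are the raw features that the statistical and machine learning techniques surveyed above consume when classifying competence levels.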
Abstract:
Incorporating the possibility of attaching attributes to variables in a logic programming system has been shown to allow the addition of general constraint solving capabilities to it. This approach is very attractive in that by adding a few primitives any logic programming system can be turned into a generic constraint logic programming system in which constraint solving can be user defined, and at source level - an extreme example of the "glass box" approach. In this paper we propose a different and novel use for the concept of attributed variables: developing a generic parallel/concurrent (constraint) logic programming system, using the same "glass box" flavor. We argue that a system which implements attributed variables and a few additional primitives can be easily customized at source level to implement many of the languages and execution models of parallelism and concurrency currently proposed, in both shared memory and distributed systems. We illustrate this through examples and report on an implementation of our ideas.
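Attributed variables are a Prolog-level facility, but the "glass box" idea translates to any language: unification consults user-installed attribute hooks, so the constraint solver is ordinary source-level code rather than part of the engine. A toy Python analogue (hypothetical names, a single finite-domain attribute) might look like:

```python
class Var:
    """A logic variable that may carry named attributes (constraints)."""
    def __init__(self):
        self.value = None
        self.attrs = {}   # attribute name -> data, e.g. a finite domain

def bind(var, value):
    """Unify a variable with a value; attribute hooks can veto the binding.

    This mirrors the 'glass box' idea: the solver is ordinary user code
    invoked from a hook, not built into the engine."""
    for name, data in var.attrs.items():
        if not HOOKS[name](data, value):
            return False          # constraint violated, unification fails
    var.value = value
    return True

# User-defined hook: finite-domain membership check.
HOOKS = {"dom": lambda domain, value: value in domain}
```

In an actual attributed-variables implementation the hook also fires on variable-variable unification and can rewrite the attribute, which is what makes the parallel/concurrent extensions described above expressible at source level.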
Abstract:
We discuss from a practical point of view a number of issues involved in writing Internet and WWW applications using LP/CLP systems. We describe PiLLoW, a public-domain Internet and WWW programming library for LP/CLP systems which we argue significantly simplifies the process of writing such applications. PiLLoW provides facilities for generating HTML structured documents, producing HTML forms, writing form handlers, accessing and parsing WWW documents, and accessing code posted at HTTP addresses. We also describe the architecture of some application classes, using a high-level model of client-server interaction: active modules. We then propose an architecture for automatic LP/CLP code downloading for local execution, using generic browsers. Finally, we also provide an overview of related work on the topic. The PiLLoW library has been developed in the context of the &-Prolog and CIAO systems, but it has been adapted to a number of popular LP/CLP systems, supporting most of its functionality.
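PiLLoW represents HTML documents as Prolog terms that are rendered to text. As an illustration of the same idea in Python (a sketch of the concept, not PiLLoW's actual interface), a nested (tag, attrs, children) term can be rendered recursively:

```python
def html(term):
    """Render a nested (tag, attrs, children) term as an HTML string.

    Strings are emitted verbatim; tuples become elements. This mimics the
    term-to-document translation a library like PiLLoW performs in Prolog.
    """
    if isinstance(term, str):
        return term
    tag, attrs, children = term
    attr_s = "".join(f' {k}="{v}"' for k, v in attrs.items())
    return f"<{tag}{attr_s}>" + "".join(html(c) for c in children) + f"</{tag}>"
```

The appeal of the term representation is that documents and forms can be built, inspected, and transformed with the language's ordinary data-structure operations before being rendered.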
Abstract:
An efficient approach is presented to improve the local and global approximation and modelling capability of the Takagi-Sugeno (T-S) fuzzy model. The main aim is obtaining high function approximation accuracy. The main problem is that the T-S identification method cannot be applied when the membership functions are overlapped by pairs. This restricts the use of the T-S method, because this type of membership function has been widely used during the last two decades in stability analysis and controller design and is popular in industrial control applications. The approach developed here can be considered a generalized version of the T-S method with optimized performance in approximating nonlinear functions. A simple approach with little computational effort, based on the well-known parameter-weighting method, is suggested for tuning the T-S parameters to improve the choice of the performance index and minimize it. A global fuzzy controller (FC) based on a Linear Quadratic Regulator (LQR) is proposed in order to show the effectiveness of the estimation method developed here in control applications. Illustrative examples of an inverted pendulum and the Van der Pol system are chosen to evaluate the robustness and remarkable performance of the proposed method and the high accuracy obtained in approximating nonlinear and unstable systems locally and globally, in comparison with the original T-S model. Simulation results indicate the potential, simplicity and generality of the algorithm.
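The T-S model's output is a membership-weighted blend of local linear consequents, and the pairwise overlap of the membership functions is precisely what complicates naive identification of the local models. A minimal sketch of the inference step (assumed membership shapes and illustrative rule parameters, not the paper's identified model):

```python
def ts_output(x, rules):
    """Takagi-Sugeno inference: weighted average of local linear models.

    rules: list of (membership_fn, a, b) where the local model is a*x + b.
    """
    weights = [mu(x) for mu, _, _ in rules]
    total = sum(weights)
    if total == 0:
        return 0.0
    return sum(w * (a * x + b) for w, (_, a, b) in zip(weights, rules)) / total

# Two overlapping membership functions on [0, 1] (assumed shapes).
low  = lambda x: max(0.0, 1.0 - x)        # full at 0, zero at 1
high = lambda x: max(0.0, min(1.0, x))    # zero at 0, full at 1

rules = [(low, 1.0, 0.0),    # near 0: y = x
         (high, 2.0, 1.0)]   # near 1: y = 2x + 1
```

In the overlap region both rules fire with nonzero weight, so the global output interpolates between the local models; tuning the rule parameters against a performance index is what the weighting-based approach above addresses.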
Abstract:
Soft-rot Enterobacteriaceae (SRE), which belong to the genera Pectobacterium and Dickeya, consist mainly of broad host-range pathogens that cause wilt, rot, and blackleg diseases on a wide range of plants. They are found in plants, insects, soil, and water in agricultural regions worldwide. SRE encode all six known protein secretion systems present in gram-negative bacteria, and these systems are involved in attacking host plants and competing bacteria. They also produce and detect multiple types of small molecules to coordinate pathogenesis, modify the plant environment, attack competing microbes, and perhaps to attract insect vectors. This review integrates new information about the role protein secretion and detection and production of ions and small molecules play in soft-rot pathogenicity.
Abstract:
This paper analyzes and assesses three experiences with telematic and electronic voting, addressing aspects such as security and the fulfilment of social requirements. These experiences were chosen on the basis of the depth of their public documentation and the technological challenges they face.
Abstract:
This paper is on homonymous distributed systems where processes are prone to crash failures and have no initial knowledge of the system membership ("homonymous" means that several processes may have the same identifier). New classes of failure detectors suited to these systems are first defined. Among them, the classes HΩ and HΣ are introduced that are the homonymous counterparts of the classes Ω and Σ, respectively. (Recall that the pair ⟨Ω, Σ⟩ defines the weakest failure detector to solve consensus.) Then, the paper shows how HΩ and HΣ can be implemented in homonymous systems without membership knowledge (under different synchrony requirements). Finally, two algorithms are presented that use these failure detectors to solve consensus in homonymous asynchronous systems where there is no initial knowledge of the membership. One algorithm solves consensus with ⟨HΩ, HΣ⟩, while the other uses only HΩ, but needs a majority of correct processes. Observe that systems with unique identifiers and anonymous systems are extreme cases of homonymous systems, from which it follows that all these results also apply to those systems. Interestingly, the new failure detector class HΩ can be implemented with partial synchrony, while the analogous class AΩ defined for anonymous systems cannot be implemented (even in synchronous systems). Hence, the paper provides us with the first proof showing that consensus can be solved in anonymous systems with only partial synchrony (and a majority of correct processes).
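An Omega-style failure detector eventually outputs, at every correct process, the same identifier of some correct process; under homonymy that identifier may be shared by several processes. A toy, centralized idealization of that eventual output for a given run (a simulation aid with hypothetical names, not a distributed implementation and not the paper's algorithm):

```python
def omega_output(ids, crashed):
    """Idealized eventual-leader (Omega-style) oracle for one run.

    ids     : identifier of each process (duplicates allowed -> homonymy)
    crashed : set of process indexes that crash in this run
    Returns the identifier every correct process eventually trusts:
    here, the smallest identifier owned by at least one correct process.
    """
    alive = [ids[i] for i in range(len(ids)) if i not in crashed]
    assert alive, "at least one correct process is required"
    return min(alive)
```

Note that the oracle returns an identifier rather than a process index: with duplicated identifiers, several correct processes may all legitimately consider themselves the trusted leader, which is exactly the complication the homonymous classes have to handle.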
Abstract:
This paper focuses on the general problem of coordinating multiple robots. More specifically, it addresses the self-selection of heterogeneous specialized tasks by autonomous robots. We focus on a distributed, decentralized approach, in which the robots themselves, autonomously and individually, are responsible for selecting a particular task so that all the existing tasks are optimally distributed and executed. In this regard, we have established an experimental scenario to solve the corresponding multi-task distribution problem, and we propose solutions using two different approaches: Response Threshold Models and Learning Automata-based probabilistic algorithms. We have evaluated the robustness of the algorithms by perturbing the number of pending loads, to simulate the robots' error in estimating the real number of pending tasks, and also by generating loads dynamically over time. The paper ends with a critical discussion of experimental results.
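In the response-threshold family of models, a robot engages in a task with probability s^n / (s^n + θ^n), where s is the task's stimulus (e.g. pending load) and θ the robot's threshold for that task. A minimal sketch (illustrative parameters; the paper's actual model and its Learning Automata variant differ in details):

```python
def engage_probability(stimulus, threshold, n=2):
    """Response-threshold rule: P(engage) = s^n / (s^n + theta^n)."""
    return stimulus ** n / (stimulus ** n + threshold ** n)

def pick_task(stimuli, thresholds):
    """A robot self-selects the task it is most likely to respond to.

    Deterministic argmax shown for clarity; the model itself is
    stochastic, sampling each engagement with the probability above.
    """
    probs = [engage_probability(s, t) for s, t in zip(stimuli, thresholds)]
    return probs.index(max(probs))
```

Because each robot evaluates only local stimuli and its own thresholds, task allocation emerges without any central coordinator, which is the decentralization property the paper studies.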