881 results for "Minimization of open stack problem"


Relevance: 100.00%

Abstract:

BACKGROUND: Interleukin-1 is pivotal in the pathogenesis of systemic juvenile idiopathic arthritis (JIA). We assessed the efficacy and safety of canakinumab, a selective, fully human, anti-interleukin-1β monoclonal antibody, in two trials. METHODS: In trial 1, we randomly assigned patients, 2 to 19 years of age, with systemic JIA and active systemic features (fever; ≥2 active joints; C-reactive protein, >30 mg per liter; and glucocorticoid dose, ≤1.0 mg per kilogram of body weight per day), in a double-blind fashion, to a single subcutaneous dose of canakinumab (4 mg per kilogram) or placebo. The primary outcome, termed adapted JIA ACR 30 response, was defined as improvement of 30% or more in at least three of the six core criteria for JIA, worsening of more than 30% in no more than one of the criteria, and resolution of fever. In trial 2, after 32 weeks of open-label treatment with canakinumab, patients who had a response and underwent glucocorticoid tapering were randomly assigned to continued treatment with canakinumab or to placebo. The primary outcome was time to flare of systemic JIA. RESULTS: At day 15 in trial 1, more patients in the canakinumab group had an adapted JIA ACR 30 response (36 of 43 [84%], vs. 4 of 41 [10%] in the placebo group; P<0.001). In trial 2, among the 100 patients (of 177 in the open-label phase) who underwent randomization in the withdrawal phase, the risk of flare was lower among patients who continued to receive canakinumab than among those who were switched to placebo (74% of patients in the canakinumab group had no flare, vs. 25% in the placebo group, according to Kaplan-Meier estimates; hazard ratio, 0.36; P=0.003). The average glucocorticoid dose was reduced from 0.34 to 0.05 mg per kilogram per day, and glucocorticoids were discontinued in 42 of 128 patients (33%). The macrophage activation syndrome occurred in 7 patients; infections were more frequent with canakinumab than with placebo. CONCLUSIONS: These two phase 3 studies show the efficacy of canakinumab in systemic JIA with active systemic features. (Funded by Novartis Pharma; ClinicalTrials.gov numbers, NCT00889863 and NCT00886769.).
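
The responder definition above is a plain decision rule, so it can be written down directly. A minimal sketch, assuming percent changes are coded so that negative values mean improvement (a representational choice, not something the abstract specifies):

```python
# Minimal sketch of the "adapted JIA ACR 30" rule as defined in the abstract:
# >=30% improvement in at least 3 of the 6 core criteria, >30% worsening in at
# most 1 criterion, and resolution of fever. The sign convention (negative =
# improvement) is an assumption made for illustration.
def adapted_jia_acr30(pct_changes: list[float], fever_resolved: bool) -> bool:
    """pct_changes: percent change from baseline in each of the 6 core JIA
    criteria, where -40 means a 40% improvement and +40 a 40% worsening."""
    assert len(pct_changes) == 6
    improved = sum(c <= -30 for c in pct_changes)  # >=30% improvement
    worsened = sum(c > 30 for c in pct_changes)    # >30% worsening
    return improved >= 3 and worsened <= 1 and fever_resolved

# Example: 4 criteria improved by >=30%, none worsened by >30%, fever resolved.
print(adapted_jia_acr30([-50, -35, -30, -45, 10, 20], fever_resolved=True))  # True
```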

Relevance: 100.00%

Abstract:

A new initiative has sprung up along the path created by the Open Access (OA) movement: Open Education (OE). Its aim is to open up educational resources at all learning levels. To achieve this goal, several international institutions, such as UNESCO and the OECD, have published reports, surveys and documents to help educational institutions in this endeavor. This global initiative needs a legal framework; as a result, efforts thus far have usually resorted to Open Licensing (OL), especially Creative Commons (CC) licensing. In fact, in response to this new movement, Creative Commons launched a new program, ccLearn, which recognizes open licensing's impact on education and directly supports the idea of open educational resources (OER). However, a good number of open questions remain: What is happening locally with OL in higher education? How are educational institutions receiving the initiative? How does the OL initiative relate to educational resources? Are there local examples of OER? How do these local instances incorporate CC into their educational frameworks? This analysis therefore focuses on the legal approach, and specifically on how the educational sector is using open licenses outside the English-speaking world, by examining the current situation in two specific scenarios: the Colombian and the Catalan experiences with open educational projects at the higher education level.

Relevance: 100.00%

Abstract:

Two shallow-water late Cenomanian to early Turonian sequences of NE Egypt were investigated to evaluate the response to OAE2. Age control, based on calcareous nannoplankton, planktic foraminiferal and ammonite biostratigraphies integrated with δ¹³C stratigraphy, is relatively good despite low diversity and sporadic occurrences. Planktic and benthic foraminiferal faunas indicate dysoxic, brackish and mesotrophic conditions, as shown by low species diversity, low-oxygen- and low-salinity-tolerant planktic and benthic species, and oyster-rich limestone layers. In these subtidal to inner neritic environments the OAE2 δ¹³C excursion appears comparable and coeval to that of open marine environments. However, in contrast to open marine environments, where anoxic conditions begin after the first δ¹³C peak and end at or near the Cenomanian-Turonian boundary, in shallow coastal environments anoxic conditions do not appear until the early Turonian. This delay in anoxia appears to be related to the sea-level transgression that reached its maximum in the early Turonian, as observed in shallow-water sections from Egypt to Morocco.

Relevance: 100.00%

Abstract:

Worldwide initiatives toward digital library (DL) support for electronic theses and dissertations (ETDs), facilitated by the work of the Networked Digital Library of Theses and Dissertations (NDLTD), are a key part of the move toward open access. When all graduate students learn to use openly available ETDs, and gain experience with authoring and submission in connection with their own research results, it will be easy for them to continue these efforts through other contributions to open access. When all universities support ETD activities, they will be key participants in institutional repositories and open access, and will have engaged in the discussion and infrastructure development that support further open access activities. Understanding of open access can also be facilitated by modeling all of these efforts with the 5S framework, which considers the key aspects of DL development: Societies, Scenarios, Spaces, Structures, and Streams.

Relevance: 100.00%

Abstract:

Interest in using ground rubber from used tires as a binder in hot asphalt mixes has been increasing due to the magnitude of the disposal problem posed by the annual addition of millions of waste tires to the refuse stream. This study evaluates in the laboratory the performance of asphalt-rubber as a hot-mix binder compared with conventional asphalt. The results indicate that asphalt-rubber outperforms its base asphalt in mixes of identical gradation and comparable void content on tests that are heavily dependent on binder characteristics (resilient modulus and indirect tension). No appreciable increase in rut resistance due to the use of asphalt-rubber is indicated.

Relevance: 100.00%

Abstract:

In recent years, it has become apparent that the design and maintenance of pavement drainage extends the service life of pavements. Most pavement structures now incorporate subsurface layers, part of whose function is to drain away excess water, which can be extremely deleterious to the life of the pavement. To ensure the effectiveness of such drainage layers after they have been spread and compacted, simple, rapid, in-situ permeability and stability testing and end-result specifications are needed. This report includes conclusions and recommendations related to four main study objectives: (1) determine the optimal range for in-place stability and in-place permeability based on Iowa aggregate sources; (2) evaluate the feasibility of an air permeameter for determining the permeability of open- and well-graded drainage layers in situ; (3) develop reliable end-result quality control/quality assurance specifications for stability and permeability; and (4) refine aggregate placement and construction methods to optimize uniformity.

Relevance: 100.00%

Abstract:

The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, RISC processor architecture is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet market was formed with new rules and new key players in the ICT industry. Today more RISC computer systems run Linux (or other Unix variants) than any other kind of computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated architectures consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk leverage, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. The thesis examines the current state of industrial automation and the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each focusing on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 100.00%

Abstract:

Measuring vertical jump height is an indicator of lower-body strength and power. The technological tools available to measure the vertical jump are black boxes, not open to third-party verification or adaptation. We propose a measurement system called Chronojump-Boscosystem, consisting of open hardware and free software. Methods: A microcontroller was created and validated using a square-wave generator and an oscilloscope. Two types of contact platforms were developed using different materials. These platforms were validated by measuring, with a strain gauge, the minimum pressure required for activation at different points, and by comparing the on/off times of our platforms against the Ergojump-Boscosystem platform in a sample of 8 subjects performing submaximal jumps with one foot on each platform. Agile methodologies were used to develop and validate the software. Results: All the tools fall under free software / open hardware guidelines and are, in that sense, free. The microcontroller's margin of error is 0.1%. The validity of the fiberglass platform is 0.95 (ICC). The management software contains nearly 113,000 lines of code and is available in 7 languages.
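
Contact platforms like these typically derive jump height from the measured flight time. The abstract does not state Chronojump's exact computation, so the standard ballistic flight-time estimate below (h = g·t²/8) is an assumption used for illustration:

```python
# Minimal sketch: jump height from flight time as measured by a contact
# platform. h = g * t^2 / 8 is the standard flight-time method; whether
# Chronojump uses exactly this formula is an assumption here.
G = 9.81  # gravitational acceleration, m/s^2

def jump_height_from_flight_time(flight_time_s: float) -> float:
    """Jump height in metres for a flight time in seconds: the jumper rises
    for t/2 and falls for t/2, so h = 0.5 * G * (t/2)**2 = G * t**2 / 8."""
    return G * flight_time_s ** 2 / 8.0

# A 0.5 s flight time corresponds to roughly a 30.7 cm jump.
print(f"{jump_height_from_flight_time(0.5) * 100:.1f} cm")
```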

Relevance: 100.00%

Abstract:

A parametric procedure for the blind inversion of nonlinear channels is proposed, based on a recent method of blind source separation in nonlinear mixtures. Experiments show that the proposed algorithms perform efficiently, even in the presence of hard distortion. The method, based on minimization of the output mutual information, requires knowledge of the log-derivative of the input distribution (the so-called score function). Each algorithm consists of three adaptive blocks: one devoted to the adaptive estimation of the score function, and two others estimating the inverses of the linear and nonlinear parts of the channel, (quasi-)optimally adapted using the estimated score functions. This paper is mainly concerned with the nonlinear part, for which we propose two parametric models, the first based on a polynomial model and the second on a neural network, whereas [14, 15] proposed non-parametric approaches.
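
As an illustration of the score-estimation block, here is a minimal kernel-density sketch of the score function ψ(x) = d/dx log p(x) = p′(x)/p(x); the paper's own adaptive estimator is not reproduced here, so this particular Gaussian-kernel estimator is an assumption:

```python
# Minimal sketch: kernel estimate of the score function psi(x) = p'(x)/p(x)
# from i.i.d. samples, using an unnormalized Gaussian kernel (the shared
# normalization constant cancels in the ratio). This is one simple estimator,
# not necessarily the one used in the paper.
import numpy as np

def score_estimate(x_eval: np.ndarray, samples: np.ndarray, h: float) -> np.ndarray:
    diff = x_eval[:, None] - samples[None, :]      # (n_eval, n_samples)
    k = np.exp(-0.5 * (diff / h) ** 2)             # kernel values
    p = k.mean(axis=1)                             # density, up to a constant
    dp = (-(diff / h**2) * k).mean(axis=1)         # derivative, same constant
    return dp / p

rng = np.random.default_rng(0)
samples = rng.normal(size=5000)
xs = np.linspace(-2.0, 2.0, 5)
# For a standard Gaussian input the true score is -x.
print(score_estimate(xs, samples, h=0.3))  # close to -xs
```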

Relevance: 100.00%

Abstract:

BACKGROUND: The long-term outcome of antiretroviral therapy (ART) is not assessed in controlled trials. We aimed to analyse trends in the population effectiveness of ART in the Swiss HIV Cohort Study over the last decade. METHODS: We analysed the odds of stably suppressed viral load (ssVL: three consecutive values <50 HIV-1 RNA copies/mL) and of CD4 cell count exceeding 500 cells/μL for each year between 2000 and 2008 in three scenarios: an open cohort; a closed cohort ignoring the influx of new participants after 2000; and a worst-case closed cohort retaining lost or dead patients as virological failures in subsequent years. We used generalized estimating equations with sex, age, risk, non-White ethnicity and era of starting combination ART (cART) as fixed co-factors. Time-updated co-factors included type of ART regimen, number of new drugs and adherence to therapy. RESULTS: The open cohort included 9802 individuals (median age 38 years; 31% female). From 2000 to 2008, the proportion of participants with ssVL increased from 37 to 64% [adjusted odds ratio (OR) per year 1.16 (95% CI 1.15-1.17)] and the proportion with CD4 count >500 cells/μL increased from 40 to >50% [OR 1.07 (95% CI 1.06-1.07)]. Similar trends were seen in the two closed cohorts. Adjustment did not substantially affect time trends. CONCLUSIONS: There was no relevant dilution effect through new participants entering the open clinical cohort, and the increase in virological/immunological success over time was not an artefact of the study design of open cohorts. This can partly be explained by new treatment options and other improvements in medical care.
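
As a minimal sketch of the kind of model described, the following fits a GEE with a logit link to synthetic repeated-measures data using statsmodels; the variable names and simulated data are illustrative assumptions, not cohort data:

```python
# Minimal sketch: a GEE with logit link for repeated yearly outcomes per
# patient, echoing the analysis described above. Variable names and the
# simulated data are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_patients = 300
years = np.arange(2000, 2009)
df = pd.DataFrame({
    "patient": np.repeat(np.arange(n_patients), len(years)),
    "year": np.tile(years, n_patients),
})
df["female"] = np.repeat(rng.integers(0, 2, n_patients), len(years))
# Simulate rising odds of stable suppression over time (OR ~ 1.16 per year,
# as reported in the abstract).
logit = -1.0 + 0.15 * (df["year"] - 2000)
df["ssvl"] = (rng.random(len(df)) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

model = sm.GEE.from_formula(
    "ssvl ~ year + female",
    groups="patient",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
res = model.fit()
print(np.exp(res.params["year"]))  # adjusted odds ratio per calendar year
```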

Relevance: 100.00%

Abstract:

Many questions in evolutionary biology require an estimate of divergence times but, for groups with a sparse fossil record, such estimates rely heavily on molecular dating methods. The accuracy of these methods depends on both an adequate underlying model and the appropriate implementation of fossil evidence as calibration points. We explore the effect of these in Poaceae (grasses), a diverse plant lineage with a very limited fossil record, focusing particularly on dating the early divergences in the group. We show that molecular dating based on a data set of plastid markers is strongly dependent on the model assumptions. In particular, an acceleration of evolutionary rates at the base of Poaceae followed by a deceleration in the descendants strongly biases methods that assume an autocorrelation of rates. This problem can be circumvented by using markers that have lower rate variation, and we show that phylogenetic markers extracted from complete nuclear genomes can be a useful complement to the more commonly used plastid markers. However, estimates of divergence times remain strongly affected by different implementations of fossil calibration points. Analyses calibrated with only macrofossils lead to estimates for the age of core Poaceae ∼51-55 Ma, but the inclusion of microfossil evidence pushes this age to 74-82 Ma and leads to lower estimated evolutionary rates in grasses. These results emphasize the importance of considering markers from multiple genomes and alternative fossil placements when addressing evolutionary issues that depend on ages estimated for important groups.

Relevance: 100.00%

Abstract:

The primary objective of this project is to develop a design manual that would aid the county or municipal engineer in making structurally sound bridge strengthening or replacement decisions. The contents of this progress report are related only to Phase I of the study and deal primarily with defining the extent of the bridge problem in Iowa. In addition, the types of bridges to which the manual should be directed have been defined.

Relevance: 100.00%

Abstract:

Background: Recent research based on comparisons between bilinguals and monolinguals postulates that bilingualism enhances cognitive control functions, because the parallel activation of languages necessitates control of interference. In a novel approach we investigated two groups of bilinguals, distinguished by their susceptibility to cross-language interference, asking whether bilinguals with strong language control abilities ("non-switchers") have an advantage in executive functions (inhibition of irrelevant information, problem solving, planning efficiency, generative fluency and self-monitoring) compared to those bilinguals showing weaker language control abilities ("switchers"). Methods: 29 late bilinguals (21 women) were evaluated using various cognitive control neuropsychological tests (e.g., Tower of Hanoi, Ruff Figural Fluency Task, Divided Attention, Go/noGo) tapping executive functions, as well as four subtests of the Wechsler Adult Intelligence Scale. The analysis involved t-tests (two independent samples). Non-switchers (n = 16) were distinguished from switchers (n = 13) by their performance in a bilingual picture-naming task. Results: The non-switcher group demonstrated better performance on the Tower of Hanoi and the Ruff Figural Fluency task, faster reaction times in the Go/noGo and Divided Attention tasks, and produced significantly fewer errors in the Tower of Hanoi, Go/noGo, and Divided Attention tasks when compared to the switchers. Non-switchers performed significantly better on two verbal subtests of the Wechsler Adult Intelligence Scale (Information and Similarity), but not on the Performance subtests (Picture Completion, Block Design). Conclusions: The present results suggest that bilinguals with stronger language control do have a cognitive advantage in the administered tests involving executive functions, in particular inhibition, self-monitoring, problem solving, and generative fluency, and in two of the intelligence tests. What remains unclear is the direction of the relationship between executive functions and language control abilities.
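
A minimal sketch of the stated analysis, a t-test for two independent samples with the reported group sizes (n = 16 non-switchers vs. n = 13 switchers); the scores below are synthetic placeholders, not study data:

```python
# Minimal sketch: two-independent-samples t-test comparing non-switchers and
# switchers on a test score. The scores are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
non_switchers = rng.normal(loc=52.0, scale=8.0, size=16)  # hypothetical scores
switchers = rng.normal(loc=45.0, scale=8.0, size=13)

t, p = stats.ttest_ind(non_switchers, switchers)  # assumes equal variances
print(f"t = {t:.2f}, p = {p:.4f}")
# Welch's correction is the usual alternative when variances may differ:
t_w, p_w = stats.ttest_ind(non_switchers, switchers, equal_var=False)
```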

Relevance: 100.00%

Abstract:

Combinatorial optimization involves finding an optimal solution in a finite set of options; many everyday problems are of this kind. However, the number of options grows exponentially with the size of the problem, so that an exhaustive search for the best solution is practically infeasible beyond a certain problem size. When efficient algorithms are not available, a practical approach for obtaining an approximate solution to the problem at hand is to start with an educated guess and gradually refine it until we have a good-enough solution. Roughly speaking, this is how local search heuristics work. These stochastic algorithms navigate the problem search space by iteratively turning the current solution into new candidate solutions, guiding the search towards better solutions. The search performance therefore depends on structural aspects of the search space, which in turn depend on the move operator being used to modify solutions. A common way to characterize the search space of a problem is through the study of its fitness landscape, a mathematical object comprising the space of all possible solutions, their value with respect to the optimization objective, and a neighborhood relationship defined by the move operator. The landscape metaphor is used to explain the search dynamics as a sort of potential function; the concept is indeed similar to that of potential energy surfaces in physical chemistry. Borrowing ideas from that field, we propose to extend to combinatorial landscapes the notion of the inherent network formed by energy minima in energy landscapes. In our case, the energy minima are the local optima of the combinatorial problem, and we explore several definitions for the network edges. First, we perform an exhaustive sampling of the local optima's basins of attraction and define weighted transitions between basins by accounting for all the possible ways of crossing the basin frontier via one random move. Then, we reduce the computational burden by counting only the chances of escaping a given basin via random kick moves that start at the local optimum. Finally, we approximate network edges from the search trajectories of simple search heuristics, mining the frequency and inter-arrival times with which a heuristic visits local optima. Through these methodologies, we build a weighted directed graph that provides a synthetic view of the whole landscape and that we can characterize using the tools of complex network science. We argue that this network characterization can advance our understanding of the structural and dynamical properties of hard combinatorial landscapes. We apply our approach to prototypical problems such as the Quadratic Assignment Problem, the NK model of rugged landscapes, and the Permutation Flow-shop Scheduling Problem. We show that some network metrics can differentiate problem classes, correlate with problem non-linearity, and predict problem hardness as measured by the performance of trajectory-based local search heuristics.
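
As a minimal sketch of the first methodology (exhaustive sampling of basins of attraction, with edges weighted by all one-move basin crossings), the following builds a small local optima network for a toy NK landscape; the parameters and the best-improvement hill climber are illustrative assumptions:

```python
# Minimal sketch: extracting a local optima network from a small NK landscape
# by exhaustive basin sampling. N, K, and the best-improvement hill-climbing
# rule are illustrative choices, not the thesis's exact experimental setup.
import itertools
import numpy as np

N, K = 8, 2
rng = np.random.default_rng(0)
# Each bit's fitness contribution depends on itself and K random other bits.
neighbors_of_bit = [rng.choice([j for j in range(N) if j != i], K, replace=False)
                    for i in range(N)]
tables = [dict() for _ in range(N)]  # lazily filled random contribution tables

def fitness(s: tuple) -> float:
    total = 0.0
    for i in range(N):
        key = (s[i],) + tuple(s[j] for j in neighbors_of_bit[i])
        if key not in tables[i]:
            tables[i][key] = rng.random()
        total += tables[i][key]
    return total / N

def flips(s: tuple):
    """One-bit-flip neighborhood (the move operator)."""
    for i in range(N):
        yield s[:i] + (1 - s[i],) + s[i + 1:]

def hill_climb(s: tuple) -> tuple:
    """Best-improvement hill climbing; returns the local optimum of s's basin."""
    while True:
        best = max(flips(s), key=fitness)
        if fitness(best) <= fitness(s):
            return s
        s = best

space = list(itertools.product((0, 1), repeat=N))
basin = {s: hill_climb(s) for s in space}  # exhaustive basin assignment

# Weighted directed edges: count one-flip moves crossing between basins.
edges = {}
for s in space:
    for t in flips(s):
        u, v = basin[s], basin[t]
        if u != v:
            edges[(u, v)] = edges.get((u, v), 0) + 1

optima = set(basin.values())
print(f"{len(optima)} local optima, {len(edges)} directed basin-transition edges")
```

The resulting weighted directed graph can then be analyzed with standard complex-network metrics, which is the characterization step the abstract describes.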