970 results for Practical problems


Relevance:

30.00%

Publisher:

Abstract:

This article seeks to promote discussion about the scholarship of teaching in Australian postgraduate pre-admission practical legal training (PLT). This is germane to perceptions of the quality of accreditation of young Australian lawyers practising in a globalised profession. The article defines the scholarship of teaching and outlines its prerequisites. The present position of teacher engagement with the scholarship of teaching in Australian PLT is considered, together with the historical and organisational epistemological approaches to professional practical training. Problems of validity, measurement, performativity, and engagement in teaching scholarship are discussed. Possible methodological approaches, including Schön’s conception of action research, are considered, together with other methodologies, technologies, and practical considerations. These discussion points are directed toward future exploration of PLT teachers’ engagement with, and leadership in, the scholarship of teaching in PLT.

Relevance:

30.00%

Publisher:

Abstract:

Seawater desalination has shifted significantly toward membrane technology and away from phase-change processes during the last decade. Seawater reverse osmosis (SWRO) is in general the most familiar process due to its higher water recovery and lower energy consumption compared to other available desalination processes. Despite major advancements in SWRO technology, the desalination industry still faces a significant number of practical issues. Therefore, the potential of and problems faced by current SWRO industries, and the essential study areas, are discussed in this review for the benefit of the desalination industry. It is important to consider all five components of the SWRO process: (1) intake, (2) pre-treatment, (3) high-pressure pumping, (4) membrane separation (performance of membranes and brine disposal) and (5) product quality. The development of more corrosion-resistant piping or coating materials, valves, and pumps is believed to be in high research demand. Furthermore, brine management, which includes brine disposal and resource recovery, needs further attention. Pre-treatment sludge management and a reduced cleaning-in-place flush volume will lower the capital costs associated with evaporation ponds and the maintenance costs associated with disposal and transportation, reducing the unit cost of water. © 2013 Springer Science+Business Media Dordrecht.

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

A novel common Tabu algorithm for the global optimization of engineering problems is presented. The robustness and efficiency of the presented method are evaluated by using standard mathematical functions and by solving a practical engineering problem. The numerical results show that the proposed method is (i) superior to the conventional Tabu search algorithm in robustness, and (ii) superior to the simulated annealing algorithm in efficiency. (C) 2001 Elsevier B.V. All rights reserved.
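The abstract gives no algorithmic details, so the following is a generic continuous Tabu search sketch on a standard test function (Rastrigin); the neighborhood rule, tabu radius and all parameters are illustrative assumptions, not the authors' method.

# Generic continuous Tabu search on the Rastrigin benchmark.
import math
import random

def rastrigin(x):
    # Standard multimodal benchmark; global minimum 0 at the origin.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def tabu_search(f, dim=2, bounds=(-5.12, 5.12), iters=2000,
                step=0.5, tabu_len=20, tabu_radius=0.1):
    lo, hi = bounds
    x = [random.uniform(lo, hi) for _ in range(dim)]
    best, best_val = x[:], f(x)
    tabu = []  # recently visited points (the tabu list)
    for _ in range(iters):
        # Sample candidate moves around the current point.
        cands = []
        for _ in range(20):
            c = [min(hi, max(lo, xi + random.gauss(0, step))) for xi in x]
            # Reject candidates too close to any tabu point.
            if all(max(abs(a - b) for a, b in zip(c, t)) > tabu_radius for t in tabu):
                cands.append(c)
        if not cands:
            continue
        x = min(cands, key=f)  # accept the best non-tabu neighbor, even if worse
        tabu.append(x[:])
        if len(tabu) > tabu_len:
            tabu.pop(0)
        if f(x) < best_val:
            best, best_val = x[:], f(x)
    return best, best_val

print(tabu_search(rastrigin))

Accepting the best non-tabu neighbor even when it is worse than the current point is what lets Tabu search escape local minima, while the tabu list prevents immediate cycling back.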

Relevance:

30.00%

Publisher:

Abstract:

Neutron dosimetry using natural uranium and thorium thin films makes it possible to date minerals by the fission-track method even when poorly thermalized neutron facilities are employed. In this case, the contributions of the fissions of ²³⁵U, ²³⁸U and ²³²Th induced by thermal, epithermal and fast neutrons to the population of tracks produced during irradiation are quantified through the combined use of natural uranium and thorium films. If the Th/U ratio of the sample is known, only one irradiation (with the sample and the uranium and thorium films present) is necessary to perform the dating. However, if that ratio is unknown, it can be determined through another irradiation in which the mineral to be dated and both films are placed inside a cadmium box. Problems related to film manufacturing and calibration are discussed. Special attention is given to the use of thin films with very low uranium content. The problems encountered suggest that it may be better to replace these films with uranium-doped standard glasses calibrated against thicker uranium films (thickness greater than 1.5 × 10¹³ μm).
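In schematic form (the notation is assumed here for illustration, not taken from the article), the superposition that the two films quantify can be written as

\rho_{\mathrm{ind}} \;=\; \rho^{235}_{\mathrm{th}} + \rho^{235}_{\mathrm{epi}} + \rho^{238}_{\mathrm{fast}} \;+\; \left(\tfrac{\mathrm{Th}}{\mathrm{U}}\right)\rho^{232}_{\mathrm{fast}},

where each term is calibrated from the track density registered by the corresponding natural-uranium or thorium film in the same irradiation. When Th/U is unknown, the second irradiation inside the cadmium box suppresses the thermal contribution and supplies the extra equation needed to solve for the ratio.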

Relevance:

30.00%

Publisher:

Abstract:

In practical situations, the dynamics of the forcing function on a vibrating system cannot be considered as given a priori, and it must be taken as a consequence of the dynamics of the whole system. In other words, the forcing source has limited power, such as that provided by a DC motor, for example, and thus its own dynamics is influenced by that of the vibrating system being forced. This increases the number of degrees of freedom of the problem, which is then called a non-ideal problem. In this work, we consider two non-ideal problems analyzed by means of numerical simulations. The existence of the Sommerfeld effect was verified, that is, the effect of getting stuck at resonance (energy imparted to the DC motor being used to excite large-amplitude motions of the supporting structure). We considered two kinds of non-ideal problem: one related to the transverse vibrations of a shaft carrying two disks, and another to a piezoceramic bar transducer powered by a vacuum-tube generator, a non-ideal source. Copyright © 2007 by ASME.
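A minimal numerical sketch of a non-ideal system of this kind: a damped oscillator driven by a limited-power DC motor with an unbalanced rotor (a Kononenko-type model common in the non-ideal vibrations literature, not the paper's specific shaft or piezoceramic systems); all parameter values are illustrative.

# Kononenko-type non-ideal oscillator: motor torque a - b*phid is limited,
# so the rotor speed is coupled to the structure's response.
import numpy as np
from scipy.integrate import solve_ivp

zeta, eps, a, b = 0.02, 0.1, 0.5, 0.3

def rhs(t, y):
    x, xd, phi, phid = y
    s = np.sin(phi)
    # Coupled accelerations: M @ [xdd, phidd] = f
    M = np.array([[1.0, -eps * s],
                  [-eps * s, 1.0]])
    f = np.array([-2 * zeta * xd - x + eps * phid**2 * np.cos(phi),
                  a - b * phid])  # limited-power motor torque curve
    xdd, phidd = np.linalg.solve(M, f)
    return [xd, xdd, phid, phidd]

sol = solve_ivp(rhs, (0, 500), [0.0, 0.0, 0.0, 0.0], max_step=0.05)
# Near resonance (phid close to the natural frequency, here 1) the rotor
# speed can get "stuck" while the oscillation amplitude grows: the
# Sommerfeld effect.
print("final rotor speed:", sol.y[3, -1])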

Relevance:

30.00%

Publisher:

Abstract:

This work presents a new and simple electrode that can be used to obtain the electrochemical response of ground solid or insoluble samples. Ore samples from the Morro Velho Mine (Brazil) were employed to exemplify the use of such electrodes. The new electrode avoids the use of binders or other agents, overcoming major deterioration problems. (C) 2002 Published by Elsevier Science B.V.

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Function approximation is a very important task in environments where computation has to be based on extracting information from data samples of real-world processes, so the development of new mathematical models is an important activity for the evolution of the function approximation area. In this sense, we present the Polynomial Powers of Sigmoid (PPS) as a linear neural network. In this paper, we introduce a series of practical results for the Polynomial Powers of Sigmoid, showing some advantages of using powers of sigmoid functions relative to traditional MLP backpropagation and polynomials in function approximation problems.
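A minimal sketch of the idea as stated: approximate a target function by a linear combination of powers of a sigmoid, so that training reduces to linear least squares. The degree and the use of plain least squares are assumptions for illustration, not necessarily the paper's exact construction.

# PPS-style approximation: features are sigmoid(x)^0 ... sigmoid(x)^degree,
# so the "network" is linear in its coefficients.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pps_features(x, degree):
    # Columns: sigmoid(x)^0 (bias), sigmoid(x)^1, ..., sigmoid(x)^degree.
    s = sigmoid(x)
    return np.vander(s, degree + 1, increasing=True)

# Fit f(x) = sin(x) on [-3, 3] with a degree-6 PPS expansion.
x = np.linspace(-3, 3, 200)
y = np.sin(x)
A = pps_features(x, 6)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("max abs error:", np.max(np.abs(A @ coef - y)))

Because the model is linear in the coefficients, fitting is a single least-squares solve rather than an iterative backpropagation run, which is the practical advantage the abstract alludes to.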

Relevance:

30.00%

Publisher:

Abstract:

This paper deals with the numerical solution of complex fluid dynamics problems using a new bounded high-resolution upwind scheme (called SDPUS-C1 henceforth) for convection term discretization. The scheme is based on the TVD and CBC stability criteria and is implemented in the context of finite volume/difference methodologies, either in the CLAWPACK software package for compressible flows or in the Freeflow simulation system for incompressible viscous flows. The performance of the proposed non-oscillatory upwind scheme is demonstrated by solving two-dimensional compressible flow problems, such as shock wave propagation, and two-dimensional/axisymmetric incompressible moving free-surface flows. The numerical results demonstrate that this new cell-interface reconstruction technique works very well in several practical applications. (C) 2012 Elsevier Inc. All rights reserved.
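The abstract does not give SDPUS-C1's interpolation formula, so the sketch below shows a generic flux-limited TVD upwind update for 1D linear advection (minmod limiter), to illustrate the kind of bounded convection scheme discussed.

# Generic TVD (flux-limited) upwind scheme for u_t + a u_x = 0, a > 0.
import numpy as np

def minmod(a, b):
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def advect(u, c):
    """One step with CFL number c = a*dt/dx <= 1, periodic boundaries."""
    # Limited linear reconstruction of the value at each cell's right face.
    du = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
    face = u + 0.5 * (1 - c) * du
    flux = c * face
    return u - (flux - np.roll(flux, 1))

# Advect a square pulse; a bounded scheme keeps u within [0, 1].
u = np.where(np.abs(np.linspace(0, 1, 100, endpoint=False) - 0.3) < 0.1, 1.0, 0.0)
for _ in range(200):
    u = advect(u, 0.5)
print("min/max stay bounded:", u.min(), u.max())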

Relevance:

30.00%

Publisher:

Abstract:

Background: Physical attributes of the places in which people live, as well as their perceptions of them, may be important health determinants. The perception of the place in which people dwell may impact individual health and may be a more telling indicator of individual health than objective neighborhood characteristics. This paper aims to evaluate the psychometric and ecometric properties of a scale on perceptions of neighborhood problems in adults from Florianópolis, Southern Brazil. Methods: Individual variables, census tract level variables (per capita monthly family income) and neighborhood problem perception variables (physical and social disorder) were investigated. Multilevel models (items nested within persons, persons nested within neighborhoods) were run to assess the ecometric properties of the variables assessing neighborhood problems. Results: The response rate was 85.3% (1,720 adults). Participants were distributed across 63 census tracts. Two scales were identified using 16 items: Physical Problems and Social Disorder. The ecometric properties of the scales were satisfactory: 0.24 to 0.28 for the intra-class correlation and 0.94 to 0.96 for reliability. Higher values on the scales of problems in the physical and social domains were associated with younger age, longer residence in the same neighborhood and lower census tract income level. Conclusions: The findings support the usefulness of these scales to measure physical and social disorder problems in neighborhoods.
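For reference, the intra-class correlation and reliability figures quoted above correspond, in the standard ecometric formulation for a three-level setup of this kind (the notation is assumed here, following Raudenbush and Sampson's ecometrics rather than taken from the paper), to

\mathrm{ICC} \;=\; \frac{\sigma^2_{n}}{\sigma^2_{n} + \sigma^2_{p} + \sigma^2_{i}},
\qquad
\lambda_j \;=\; \frac{\sigma^2_{n}}{\sigma^2_{n} + \bigl(\sigma^2_{p} + \sigma^2_{i}/K\bigr)/N_j},

where \sigma^2_{n}, \sigma^2_{p} and \sigma^2_{i} are the neighborhood-, person- and item-level variance components, N_j is the number of respondents in census tract j, K is the number of items, and \lambda_j is the reliability of the tract-level scale score.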

Relevance:

30.00%

Publisher:

Abstract:

In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology. It allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization from a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes strongly correlated with the groups of individuals. Even though analysis methods are today well developed and close to reaching a standard organization (through the efforts of proposed international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to encounter a clinician's question for which no compelling statistical method exists. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs. Chapter 1, starting from a necessary biological introduction, reviews microarray technologies and all the important steps of an experiment, from the production of the array through quality controls to the preprocessing steps used in the data analysis in the rest of the dissertation. Chapter 2 provides a critical review of standard analysis methods, stressing their main open problems. Chapter 3 introduces a method to address the issue of unbalanced design in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each single probe is given a "score" ranging from 0 to 1,000 based on its recurrence in the 1,000 lists as differentially expressed. The performance of MultiSAM was compared to the performance of SAM and LIMMA [3] over two simulated data sets drawn from beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe, SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score > 300 and separates the data into two clusters by hierarchical clustering (a toy sketch of the resampling scheme follows below).
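To make the resampling scheme concrete, here is a toy re-implementation of the MultiSAM idea, with a plain Welch t-test standing in for the SAM statistic (which is not reproduced here); the resampling count and scoring rule follow the text above, while the significance threshold and the data are illustrative.

# MultiSAM-style scoring: repeatedly subsample the majority class to the
# size of the minority class and count recurrent "hits" per gene.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def multisam_scores(lpc, mpc, n_iter=1000, alpha=0.01):
    """lpc, mpc: genes x samples arrays (less/more populated class).
    Returns, per gene, how often it was significant across resamplings."""
    n_small = lpc.shape[1]
    scores = np.zeros(lpc.shape[0], dtype=int)
    for _ in range(n_iter):
        idx = rng.choice(mpc.shape[1], size=n_small, replace=False)
        _, p = stats.ttest_ind(lpc, mpc[:, idx], axis=1, equal_var=False)
        scores += (p < alpha)
    return scores  # 0..n_iter; high scores = recurrently differential

# Toy data: 500 genes, 8 vs 60 samples, first 20 genes truly shifted.
lpc = rng.normal(size=(500, 8))
mpc = rng.normal(size=(500, 60))
lpc[:20] += 1.5
print((multisam_scores(lpc, mpc, n_iter=100) > 30).sum(), "high-scoring genes")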
We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data. In Chapter 4, a method to address similarity evaluation in a three-class problem by means of the Relevance Vector Machine [4] is described. In fact, looking at microarray data in a prognostic and diagnostic clinical framework, not only differences can play a crucial role: in some cases similarities can give useful and, sometimes, even more important information. The goal, given three classes, could be to establish, with a certain level of confidence, whether the third one is similar to the first or to the second one. In this work we show that the Relevance Vector Machine (RVM) [4] could be a possible solution to the limitations of standard supervised classification. In fact, RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM). Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option of any practical pattern recognition system. We focused on a tumor-grade three-class problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3) and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, then evaluate the third class, G2, as a test set to obtain the probability of each G2 sample being a member of class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. In the literature this result had been conjectured, but no measure of significance had been given before.
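The three-class protocol can be illustrated as follows. Since the Relevance Vector Machine is not part of scikit-learn, this sketch substitutes logistic regression as a probabilistic stand-in; the class sizes match those quoted above, but the data are synthetic.

# Train on G1 vs G3, then read off posterior P(G1) for the held-out G2.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_genes = 50
g1 = rng.normal(0.0, 1, size=(67, n_genes))   # grade 1
g3 = rng.normal(1.0, 1, size=(54, n_genes))   # grade 3
g2 = rng.normal(0.3, 1, size=(100, n_genes))  # grade 2: closer to G1 here

clf = LogisticRegression(max_iter=1000)
clf.fit(np.vstack([g1, g3]), np.r_[np.zeros(67), np.ones(54)])

p_g1 = clf.predict_proba(g2)[:, 0]  # posterior membership in class G1
print("mean P(G2 sample belongs to G1):", p_g1.mean())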

Relevance:

30.00%

Publisher:

Abstract:

Mixed integer programming is today one of the most widely used techniques for dealing with hard optimization problems. On the one side, many practical optimization problems arising from real-world applications (such as, e.g., scheduling, project planning, transportation, telecommunications, economics and finance, timetabling, etc.) can be easily and effectively formulated as Mixed Integer Linear Programs (MIPs). On the other hand, more than 50 years of intensive research has dramatically improved the capability of the current generation of MIP solvers to tackle hard problems in practice. However, many questions are still open and not fully understood, and the mixed integer programming community is still more than active in trying to answer some of these questions. As a consequence, a huge number of papers are continuously developed and new intriguing questions arise every year. When dealing with MIPs, we have to distinguish between two different scenarios. The first one happens when we are asked to handle a general MIP and we cannot assume any special structure for the given problem. In this case, a Linear Programming (LP) relaxation and some integrality requirements are all we have for tackling the problem, and we are "forced" to use general purpose techniques. The second one happens when mixed integer programming is used to address a somehow structured problem. In this context, polyhedral analysis and other theoretical and practical considerations are typically exploited to devise special purpose techniques. This thesis tries to give some insights into both of the above mentioned situations. The first part of the work is focused on general purpose cutting planes, which are probably the key ingredient behind the success of the current generation of MIP solvers. Chapter 1 presents a quick overview of the main ingredients of a branch-and-cut algorithm, while Chapter 2 recalls some results from the literature in the context of disjunctive cuts and their connections with Gomory mixed integer cuts. Chapter 3 presents a theoretical and computational investigation of disjunctive cuts. In particular, we analyze the connections between different normalization conditions (i.e., conditions to truncate the cone associated with disjunctive cutting planes) and other crucial aspects such as cut rank, cut density and cut strength. We give a theoretical characterization of weak rays of the disjunctive cone that lead to dominated cuts, and propose a practical method to strengthen the cuts arising from such weak extremal solutions. Further, we point out how redundant constraints can affect the quality of the generated disjunctive cuts, and discuss possible ways to cope with them. Finally, Chapter 4 presents some preliminary ideas in the context of multiple-row cuts. Very recently, a series of papers have brought attention to the possibility of generating cuts using more than one row of the simplex tableau at a time. Several interesting theoretical results have been presented in this direction, often revisiting and recalling important results discovered more than 40 years ago. However, it is not at all clear how these results can be exploited in practice. As stated, the chapter is still a work in progress and simply presents a possible way of generating two-row cuts from the simplex tableau arising from lattice-free triangles, together with some preliminary computational results.
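For reference, the Gomory mixed integer (GMI) cut recalled here can be stated in its standard textbook form (the notation follows the usual literature, not the thesis itself). Given a simplex tableau equation \sum_j \bar a_j x_j = \bar b with \bar b fractional, x_j \ge 0, x_j integer for j \in I and continuous for j \in C, and writing f_j = \bar a_j - \lfloor \bar a_j \rfloor and f_0 = \bar b - \lfloor \bar b \rfloor, the GMI cut reads

\sum_{j \in I:\, f_j \le f_0} \frac{f_j}{f_0}\, x_j
\;+\; \sum_{j \in I:\, f_j > f_0} \frac{1 - f_j}{1 - f_0}\, x_j
\;+\; \sum_{j \in C:\, \bar a_j > 0} \frac{\bar a_j}{f_0}\, x_j
\;-\; \sum_{j \in C:\, \bar a_j < 0} \frac{\bar a_j}{1 - f_0}\, x_j \;\ge\; 1.

The connection with disjunctive cuts is that the same inequality arises as a split cut from an integer disjunction \pi x \le \pi_0 \,\vee\, \pi x \ge \pi_0 + 1 applied to the LP relaxation.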
The second part of the thesis is instead focused on the heuristic and exact exploitation of integer programming techniques for hard combinatorial optimization problems in the context of routing applications. Chapters 5 and 6 present an integer linear programming local search algorithm for Vehicle Routing Problems (VRPs). The overall procedure follows a general destroy-and-repair paradigm (i.e., the current solution is first randomly destroyed and then repaired in an attempt to find a new improved solution) in which a class of exponential neighborhoods is iteratively explored by heuristically solving an integer programming formulation through a general purpose MIP solver; a schematic sketch of this loop is given below. Chapters 7 and 8 deal with exact branch-and-cut methods. Chapter 7 presents an extended formulation for the Traveling Salesman Problem with Time Windows (TSPTW), a generalization of the well-known TSP in which each node must be visited within a given time window. The polyhedral approaches proposed for this problem in the literature typically follow the one that has proven extremely effective in the classical TSP context. Here we present a (quite) general idea based on a relaxed discretization of time windows. This idea leads to a stronger formulation and to stronger valid inequalities, which are then separated within the classical branch-and-cut framework. Finally, Chapter 8 addresses branch-and-cut in the context of Generalized Minimum Spanning Tree Problems (GMSTPs), a class of NP-hard generalizations of the classical minimum spanning tree problem. In this chapter, we show how some basic ideas (and, in particular, the usage of general purpose cutting planes) can be useful to improve on branch-and-cut methods proposed in the literature.
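A schematic version of the destroy-and-repair loop follows. The thesis repairs by heuristically solving an ILP neighborhood with a MIP solver; to keep this sketch self-contained, a greedy cheapest-insertion repair stands in for that step, and the instance data in the usage lines are invented.

# Destroy-and-repair local search for a toy capacitated VRP.
import math, random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_cost(route, pts, depot):
    stops = [depot] + [pts[i] for i in route] + [depot]
    return sum(dist(stops[k], stops[k + 1]) for k in range(len(stops) - 1))

def repair(routes, removed, pts, depot, cap, demand):
    for c in removed:  # cheapest feasible insertion (stand-in for the ILP)
        best = None
        for r in routes:
            if sum(demand[i] for i in r) + demand[c] > cap:
                continue
            for pos in range(len(r) + 1):
                cand = r[:pos] + [c] + r[pos:]
                delta = route_cost(cand, pts, depot) - route_cost(r, pts, depot)
                if best is None or delta < best[0]:
                    best = (delta, r, pos)
        _, r, pos = best
        r.insert(pos, c)

def destroy_and_repair(routes, pts, depot, cap, demand, iters=200, k=3):
    best = [r[:] for r in routes]
    best_cost = sum(route_cost(r, pts, depot) for r in best)
    for _ in range(iters):
        cur = [r[:] for r in best]
        removed = random.sample([c for r in cur for c in r], k)
        for r in cur:
            r[:] = [c for c in r if c not in removed]   # destroy
        repair(cur, removed, pts, depot, cap, demand)   # repair
        cost = sum(route_cost(r, pts, depot) for r in cur)
        if cost < best_cost:
            best, best_cost = cur, cost
    return best, best_cost

random.seed(0)
pts = {i: (random.random() * 10, random.random() * 10) for i in range(1, 13)}
demand = {i: 1 for i in pts}
routes = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]
print(destroy_and_repair(routes, pts, (5, 5), cap=5, demand=demand)[1])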

Relevance:

30.00%

Publisher:

Abstract:

This work presents exact, hybrid algorithms for mixed resource Allocation and Scheduling problems; in general terms, these consist of assigning finite-capacity resources over time to a set of precedence-connected activities. The proposed methods have broad applicability, but are mainly motivated by applications in the field of Embedded System Design. In particular, high-performance embedded computing recently witnessed the shift from single-CPU platforms with application-specific accelerators to programmable Multi-Processor Systems-on-Chip (MPSoCs). These allow higher flexibility, real-time performance and low energy consumption, but the programmer must be able to effectively exploit the platform parallelism. This raises interest in the development of algorithmic techniques to be embedded in CAD tools; in particular, given a specific application and platform, the objective is to perform optimal allocation of hardware resources and to compute an execution schedule. In this regard, since embedded systems tend to run the same set of applications for their entire lifetime, off-line, exact optimization approaches are particularly appealing. Quite surprisingly, the use of exact algorithms has not been well investigated so far; this is in part explained by the complexity of integrated allocation and scheduling, which sets tough challenges for "pure" combinatorial methods. The use of hybrid CP/OR approaches presents the opportunity to exploit the mutual advantages of different methods, while compensating for their weaknesses. In this work, we first consider an Allocation and Scheduling problem over the Cell BE processor by Sony, IBM and Toshiba; we propose three different solution methods, leveraging decomposition, cut generation and heuristic-guided search. Next, we face Allocation and Scheduling of so-called Conditional Task Graphs, explicitly accounting for branches whose outcome is not known at design time; we extend the CP scheduling framework to effectively deal with the introduced stochastic elements. Finally, we address Allocation and Scheduling with uncertain, bounded execution times via conflict-based tree search; we introduce a simple and flexible time model to take duration variability into account and provide an efficient conflict detection method. The proposed approaches achieve good results on practical-size problems, demonstrating that the use of exact approaches for system design is feasible. Furthermore, the developed techniques bring significant contributions to combinatorial optimization methods.
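To give a flavour of the kind of model involved, here is a minimal allocation-and-scheduling sketch: precedence-connected tasks sharing one finite-capacity resource, written with Google OR-tools CP-SAT rather than the thesis's hybrid CP/OR framework; the task durations, demands and capacity are invented.

# Cumulative scheduling with precedences, minimizing the makespan.
from ortools.sat.python import cp_model

tasks = {"a": (3, 2), "b": (2, 1), "c": (4, 2), "d": (2, 1)}  # duration, demand
prec = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")]
capacity = 3
horizon = sum(dur for dur, _ in tasks.values())

m = cp_model.CpModel()
start, end, ivals, demands = {}, {}, [], []
for name, (dur, dem) in tasks.items():
    start[name] = m.NewIntVar(0, horizon, f"s_{name}")
    end[name] = m.NewIntVar(0, horizon, f"e_{name}")
    ivals.append(m.NewIntervalVar(start[name], dur, end[name], f"i_{name}"))
    demands.append(dem)
for u, v in prec:
    m.Add(end[u] <= start[v])                  # precedence constraints
m.AddCumulative(ivals, demands, capacity)      # finite-capacity resource
makespan = m.NewIntVar(0, horizon, "makespan")
m.AddMaxEquality(makespan, list(end.values()))
m.Minimize(makespan)

solver = cp_model.CpSolver()
if solver.Solve(m) == cp_model.OPTIMAL:
    print({n: solver.Value(start[n]) for n in tasks}, solver.Value(makespan))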