116 results for paradigm shift
Abstract:
Resistance to chemotherapy and molecularly targeted therapies is a major problem facing current cancer research. The mechanisms of resistance to 'classical' cytotoxic chemotherapeutics and to therapies that are designed to be selective for specific molecular targets share many features, such as alterations in the drug target, activation of prosurvival pathways and ineffective induction of cell death. With the increasing arsenal of anticancer agents, improving preclinical models and the advent of powerful high-throughput screening techniques, there are now unprecedented opportunities to understand and overcome drug resistance through the clinical assessment of rational therapeutic drug combinations and the use of predictive biomarkers to enable patient stratification.
Abstract:
Life science research aims to continuously improve the quality and standard of human life. One of the major challenges in this area is to maintain food safety and security. A number of image processing techniques have been used to investigate the quality of food products. In this paper, we propose a new algorithm to effectively segment connected grains so that each of them can be inspected in a later processing stage. One family of existing segmentation methods is based on the idea of watershed segmentation, and it has shown promising results in practice. However, due to over-segmentation, this technique performs poorly in applications with inhomogeneous backgrounds and connected targets. To solve this problem, we present a combination of two classical techniques. In the first step, a mean shift filter is used to eliminate the inhomogeneous background, with entropy used as the convergence criterion. In the second step, a color gradient algorithm is used to detect the most significant edges, and a marker-based watershed transform is applied to segment the touching objects from the output of the previous processing stages. The proposed framework balances execution time, usability, efficiency and segmentation quality in analyzing ring die pellets. The experimental results demonstrate that the proposed approach is effective and robust.
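The final stage of the pipeline sketched in this abstract, a marker-based watershed, can be illustrated with a minimal 1-D flooding sketch. This is an assumption-laden toy (real implementations such as OpenCV's cv2.watershed operate on 2-D gradient images; the profile, seed positions and labels below are invented for illustration):

```python
import heapq

def watershed_1d(heights, markers):
    """Marker-controlled watershed flooding on a 1-D 'topographic' profile.

    heights -- gradient-magnitude profile (high values = edges/ridges)
    markers -- dict mapping seed position -> region label
    Positions are flooded outward from the seeds in order of increasing
    height, so touching regions split near the ridge between their seeds.
    """
    n = len(heights)
    labels = [None] * n
    heap = [(heights[pos], pos) for pos in markers]
    for pos, lab in markers.items():
        labels[pos] = lab
    heapq.heapify(heap)
    while heap:
        h, pos = heapq.heappop(heap)
        for nb in (pos - 1, pos + 1):               # 1-D neighbourhood
            if 0 <= nb < n and labels[nb] is None:
                labels[nb] = labels[pos]            # claimed by the first basin to arrive
                heapq.heappush(heap, (max(h, heights[nb]), nb))
    return labels

# Two touching "grains": flat gradient inside each, a ridge (9) where they meet.
profile = [1, 1, 2, 1, 1, 9, 1, 1, 2, 1, 1]
seeds = {2: 1, 8: 2}                                # one seed inside each grain
labels = watershed_1d(profile, seeds)
print(labels)
```

The priority queue guarantees that low-gradient (grain-interior) pixels are labelled before the ridge, which is how marker seeding avoids the over-segmentation of the plain watershed.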
Abstract:
Human listeners seem to be remarkably able to recognise acoustic sound sources based on timbre cues. Here we describe a psychophysical paradigm to estimate the time it takes to recognise a set of complex sounds differing only in timbre cues: both in terms of the minimum duration of the sounds and the inferred neural processing time. Listeners had to respond to the human voice while ignoring a set of distractors. All sounds were recorded from natural sources over the same pitch range and equalised to the same duration and power. In a first experiment, stimuli were gated in time with a raised-cosine window of variable duration and random onset time. A voice/non-voice (yes/no) task was used. Performance, as measured by d', remained above chance for the shortest sounds tested (2 ms); d's above 1 were observed for durations longer than or equal to 8 ms. Then, we constructed sequences of short sounds presented in rapid succession. Listeners were asked to report the presence of a single voice token that could occur at a random position within the sequence. This method is analogous to the "rapid serial visual presentation" (RSVP) paradigm, which has been used to evaluate neural processing time for images. For 500-ms sequences made of 32-ms and 16-ms sounds, d' remained above chance for presentation rates of up to 30 sounds per second. There was no effect of the pitch relation between successive sounds: identical for all sounds in the sequence or random for each sound. This implies that the task was not determined by streaming or forward masking, as both phenomena would predict better performance for the random pitch condition. Overall, the recognition of familiar sound categories such as the voice seems to be surprisingly fast, both in terms of the acoustic duration required and of the underlying neural time constants.
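The sensitivity index d' used in this abstract is computed from hit and false-alarm rates as z(H) − z(F), where z is the inverse standard normal CDF. A minimal sketch (the trial counts are invented for illustration, not data from the study; the log-linear correction is one common way to keep z finite):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Yes/no sensitivity index: d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (+0.5 to counts, +1 to totals) keeps the
    rates strictly inside (0, 1) so the inverse normal CDF is finite.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical block: 45/50 voice trials hit, 10/50 distractor trials false-alarmed.
print(d_prime(45, 5, 10, 40))
```

With these made-up counts d' comes out slightly above 2, i.e. well above the d' = 1 criterion mentioned in the abstract; equal hit and false-alarm rates give d' = 0 (chance).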
Abstract:
From the early 1900s, some psychologists have attempted to establish their discipline as a quantitative science. In using quantitative methods to investigate their theories, they adopted their own special definition of measurement of attributes such as cognitive abilities, as though they were quantities of the type encountered in Newtonian science. Joel Michell has presented a carefully reasoned argument that psychological attributes lack additivity, and therefore cannot be quantities in the same way as the attributes of classical Newtonian physics. In the early decades of the 20th century, quantum theory superseded Newtonian mechanics as the best model of physical reality. This paper gives a brief, critical overview of the evolution of current measurement practices in psychology, and suggests the need for a transition from a Newtonian to a quantum theoretical paradigm for psychological measurement. Finally, a case study is presented that considers the implications of a quantum theoretical model for educational measurement. In particular, it is argued that, since the OECD’s Programme for International Student Assessment (PISA) is predicated on a Newtonian conception of measurement, this may constrain the extent to which it can make accurate comparisons of the achievements of different education systems.
Abstract:
This study is the first to compare random regret minimisation (RRM) and random utility maximisation (RUM) in a freight transport application, considering a scenario that involves a negative shock to the reference alternative. Based on data from two stated choice experiments conducted among Swiss logistics managers, the study contributes to the related literature by exploring, for the first time, the use of mixed logit models within the most recent version of the RRM approach. We further compare the two paradigms by computing elasticities and forecasting choice probabilities. We find that regret is important in describing the managers' choices. Regret increases in the shock scenario, supporting the idea that a shift in reference point can cause a shift towards regret minimisation. Differences in elasticities and forecast probabilities are identified and discussed.
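Under the classical RRM model compared in this abstract, the systematic regret of an alternative accumulates over every competitor j and attribute m as ln(1 + exp(β_m(x_jm − x_im))), and choice probabilities take a logit form over negative regrets. A minimal sketch (the alternatives, attributes and coefficients are invented for illustration, not taken from the Swiss data):

```python
import math

def regret(i, alts, betas):
    """Classical RRM systematic regret of alternative i:
    R_i = sum over j != i, m of ln(1 + exp(beta_m * (x_jm - x_im)))."""
    return sum(
        math.log1p(math.exp(betas[m] * (alts[j][m] - alts[i][m])))
        for j in range(len(alts)) if j != i
        for m in range(len(betas))
    )

def choice_probs(alts, betas):
    """Logit-type probabilities over negative regrets: P_i = exp(-R_i) / sum_j exp(-R_j)."""
    expv = [math.exp(-regret(i, alts, betas)) for i in range(len(alts))]
    total = sum(expv)
    return [v / total for v in expv]

# Invented freight alternatives (cost, transit time in hours) and taste parameters.
alts = [(100.0, 48.0), (120.0, 36.0), (90.0, 60.0)]
betas = (-0.05, -0.10)  # negative: a cheaper or faster competitor generates regret
print(choice_probs(alts, betas))
```

Because regret is driven by pairwise comparisons against competitors, shifting the reference alternative changes every alternative's regret, which is the mechanism behind the shock scenario the study examines.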
Abstract:
The outcomes of educational assessments undoubtedly have real implications for students, teachers, schools and education in the widest sense. Assessment results are, for example, used to award qualifications that determine future educational or vocational pathways of students. The results obtained by students in assessments are also used to gauge individual teacher quality, to hold schools to account for the standards achieved by their students, and to compare international education systems. Given the current high-stakes nature of educational assessment, it is imperative that the measurement practices involved have stable philosophical foundations. However, this paper casts doubt on the theoretical underpinnings of contemporary educational measurement models. Aspects of Wittgenstein’s later philosophy and Bohr’s philosophy of quantum theory are used to argue that a quantum theoretical rather than a Newtonian model is appropriate for educational measurement, and the associated implications for the concept of validity are elucidated. Whilst it is acknowledged that the transition to a quantum theoretical framework would not lead to the demise of educational assessment, it is argued that, where practical, current high-stakes assessments should be reformed to become as ‘low-stakes’ as possible. The paper also undermines some of the pro high-stakes testing rhetoric that has a tendency to afflict education.
Abstract:
Heterogeneous Networks (HetNets) are known to enhance the bandwidth efficiency and throughput of wireless networks by more effectively utilizing the network resources. However, the higher density of users and access points in HetNets introduces significant inter-user interference that needs to be mitigated through complex and sophisticated interference cancellation schemes. Moreover, due to significant channel attenuation and the presence of hardware impairments, e.g., phase noise and amplifier nonlinearities, the vast bandwidth in the millimeter-wave band has not been fully utilized to date. In order to enable the development of multi-gigabit-per-second wireless networks, we introduce a novel millimeter-wave HetNet paradigm, termed hybrid HetNet, which exploits the vast bandwidth and propagation characteristics in the 60 GHz and 70–80 GHz bands to reduce the impact of interference in HetNets. Simulation results are presented to illustrate the performance advantage of hybrid HetNets with respect to traditional networks. Next, two specific transceiver structures that enable hand-offs from the 60 GHz band (the V-band) to the 70–80 GHz band (the E-band), and vice versa, are proposed. Finally, the practical and regulatory challenges for establishing a hybrid HetNet are outlined.
Abstract:
Next-generation sequencing (NGS) is beginning to show its full potential for diagnostic and therapeutic applications. In particular, it is demonstrating its capacity to contribute to a molecular taxonomy of cancer, to be used as a standard approach for diagnostic mutation detection, and to open new treatment options that are not exclusively organ-specific. If this is the case, how much validation is necessary, and what should the validation strategy be, when bringing NGS into diagnostic/clinical practice? This validation strategy should address key issues such as: what is the overall extent of the validation? Should essential indicators of test performance such as sensitivity or specificity be calculated for every target or sample type? Should bioinformatic interpretation approaches be validated with the same rigour? What is a competitive clinical turnaround time for an NGS-based test, and when does it become a cost-effective testing proposition? While we address these and other related topics in this commentary, we also suggest that a single set of international guidelines for the validation and use of NGS technology in routine diagnostics may allow us all to make much more effective use of resources.
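For reference, the test-performance indicators the commentary mentions, sensitivity and specificity, reduce to simple ratios of confusion-matrix counts. A minimal sketch with invented variant-calling counts (not figures from any real validation):

```python
def sensitivity(true_pos, false_neg):
    """Fraction of true variants the assay detects: TP / (TP + FN)."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of variant-free positions correctly called negative: TN / (TN + FP)."""
    return true_neg / (true_neg + false_pos)

# Invented validation run: 95 of 100 known variants detected,
# 990 of 1000 wild-type positions called correctly.
print(sensitivity(95, 5), specificity(990, 10))
```

Computing these per target and per sample type, as the commentary asks, simply means keeping a separate confusion matrix for each stratum.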
Abstract:
A spectroscopic study of the He-α (1s² ¹S₀ – 1s2p ¹P₁) line emission (4749.73 eV) from high-density plasma was conducted. The plasma was produced by irradiating Ti targets with intense (I ≈ 1×10¹⁹ W/cm²), 400 nm wavelength, high-contrast, short (45 fs) p-polarized laser pulses at an angle of 45°. A line shift of up to 3.4 ± 1.0 eV (1.9 ± 0.55 mÅ) was observed in the He-α line. The line width of the resonance line at FWHM was measured to be 12.1 ± 0.6 eV (6.7 ± 0.35 mÅ). For comparison, we looked into the emission of the same spectral line from plasma produced by irradiating the same target with laser pulses of reduced intensity (≈10¹⁷ W/cm²): we observed a spectral shift of only 1.8 ± 1.0 eV (0.9 ± 0.55 mÅ), and the line width measured up to 5.8 ± 0.25 eV (2.7 ± 0.35 mÅ). These data provide evidence of a plasma polarization shift of the Ti He-α line.
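The paired eV/mÅ values in this abstract are related through the line's dispersion: since λ = hc/E, a small energy shift ΔE corresponds to Δλ ≈ (hc/E²)·ΔE. A quick check in Python (using hc ≈ 12398.42 eV·Å) reproduces the quoted 1.9 mÅ for the main 3.4 eV shift to within rounding:

```python
HC_EV_ANGSTROM = 12398.42  # hc in eV * Angstrom

def shift_in_milliangstrom(line_energy_ev, shift_ev):
    """Convert an energy shift (eV) of a line at line_energy_ev (eV) into mA:
    |dlambda/dE| = hc / E^2, so delta_lambda = hc * delta_E / E^2."""
    return 1e3 * HC_EV_ANGSTROM * shift_ev / line_energy_ev ** 2

# Ti He-alpha at 4749.73 eV: a 3.4 eV shift maps to roughly 1.9 mA.
print(shift_in_milliangstrom(4749.73, 3.4))
```

The same relation lets either quoted unit (eV or mÅ) be recovered from the other for any line whose central energy is known.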