65 results for Over Head Line Design


Relevance: 30.00%

Abstract:

The authors demonstrate four real-time reactive responses to movement in everyday scenes using an active head/eye platform. They first describe the design and realization of a high-bandwidth four-degree-of-freedom head/eye platform and visual feedback loop for the exploration of motion processing within active vision. The vision system divides processing into two scales and two broad functions. At a coarse, quasi-peripheral scale, detection and segmentation of new motion occur across the whole image; at a fine scale, tracking of already detected motion takes place within a foveal region. Several simple coarse-scale motion sensors, which run concurrently at 25 Hz with latencies of around 100 ms, are detailed. The use of these sensors to drive the following real-time responses is discussed: (1) head/eye saccades to moving regions of interest; (2) a panic response to looming motion; (3) an opto-kinetic response to continuous motion across the image; and (4) smooth pursuit of a moving target using motion alone.
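
As a rough illustration of the coarse-scale detection stage described above (a minimal sketch, not the authors' implementation: the frame format, threshold and pixel-count values are assumptions), frame differencing over the whole image can segment new motion and yield a region of interest for a saccade:

```python
import numpy as np

def detect_motion(prev_frame, frame, threshold=15.0, min_pixels=50):
    """Coarse-scale motion detection by frame differencing.

    Frames are greyscale uint8 arrays; threshold and min_pixels are
    assumed values.  Returns the centroid (row, col) of the moving
    region, or None if too few pixels changed.
    """
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = diff > threshold                 # segment newly moving pixels
    if moving.sum() < min_pixels:             # reject sensor noise
        return None
    rows, cols = np.nonzero(moving)
    return rows.mean(), cols.mean()

def saccade_demand(centroid, image_centre):
    """Gaze shift (in pixels) needed to foveate the detected motion."""
    return (centroid[0] - image_centre[0], centroid[1] - image_centre[1])
```

At a 25 Hz field rate each pass of such a loop has 40 ms of frame time available, so the ~100 ms latencies quoted above correspond to roughly two to three frame times.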

Relevance: 30.00%

Abstract:

As with any technology system, analysis and design are among the fundamental challenges in persuasive technology. Currently, the Persuasive Systems Development (PSD) framework is considered the most comprehensive framework for the design and evaluation of persuasive systems. However, the framework provides little detailed guidance for selecting appropriate techniques as the nature of users, or their use of a system, varies over time. In light of this, we propose a model for analysing and implementing behavioural change in persuasive technology, called the 3D-RAB model. The 3D-RAB model represents the three-dimensional relationship between attitude towards behaviour, attitude towards change (or maintaining a change), and current behaviour, and distinguishes variable levels in a user's cognitive state. As such, it provides a framework that can be used to select appropriate techniques for persuasive technology.
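
To make the three-dimensional structure concrete (an illustrative sketch only: coding each dimension as a binary level is an assumption for the example, not the paper's exact formulation), the candidate cognitive states can simply be enumerated:

```python
from itertools import product

# The three dimensions of the 3D-RAB model; each is coded here as
# negative (0) or positive (1) purely for illustration.
DIMENSIONS = ("attitude_towards_behaviour",
              "attitude_towards_change",
              "current_behaviour")

# Enumerate the candidate cognitive states a user may occupy.
states = [dict(zip(DIMENSIONS, levels))
          for levels in product((0, 1), repeat=len(DIMENSIONS))]

for i, state in enumerate(states):
    print(i, state)   # eight distinct states for binary levels
```

A designer could then attach candidate persuasive techniques to each state and reselect them as the user's state changes over time.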

Relevance: 30.00%

Abstract:

In situ high-resolution aircraft measurements of cloud microphysical properties were made in coordination with ground-based remote sensing observations (radar and lidar) of a line of small cumulus clouds, as part of the Aerosol Properties, PRocesses And InfluenceS on the Earth's climate (APPRAISE) project. A narrow but extensive line (~100 km long) of shallow convective clouds over the southern UK was studied. Cloud-top temperatures were observed to be higher than −8 °C, but the clouds were seen to consist of supercooled droplets and varying concentrations of ice particles. No ice particles were observed to be falling into the cloud tops from above. Current parameterisations of ice nuclei (IN) numbers predict that too few particles would be active as ice nuclei to account for the ice particle concentrations at the observed near-cloud-top temperatures (−7.5 °C). The role of mineral dust particles, at concentrations consistent with those observed near the surface, acting as high-temperature IN is considered important in this case. It was found that very high concentrations of ice particles (up to 100 L⁻¹) could be produced by secondary ice particle production, provided the observed small amount of primary ice (about 0.01 L⁻¹) was present to initiate it. This emphasises the need to understand primary ice formation in slightly supercooled clouds. It is shown using simple calculations that the Hallett-Mossop (HM) process is the likely source of the secondary ice. Model simulations of the case study were performed with the Aerosol Cloud and Precipitation Interactions Model (ACPIM). These parcel-model investigations confirmed the HM process to be a very important mechanism for producing the observed high ice concentrations. A key step in generating the high concentrations was collision and coalescence of rain drops which, once formed, fell rapidly through the cloud, collecting ice particles that caused them to freeze and form large riming particles almost instantly. The broadening of the droplet size distribution by collision-coalescence was therefore a vital step in this process, as it was required to generate the large number of ice crystals observed in the time available. Simulations were also performed with the WRF (Weather Research and Forecasting) model. The results showed that while HM does act to increase the mass and number concentration of ice particles in these simulations, it was not found to be critical for the formation of precipitation. However, the WRF simulations produced a cloud top that was too cold, and this, combined with the assumption of continual replenishment of ice nuclei removed by ice crystal formation, resulted in too many ice crystals forming by primary nucleation compared with the observations and parcel modelling.
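
To make the scale of the multiplication explicit, the observed ratio of secondary to primary ice concentrations quoted above is

$$\frac{N_{\mathrm{secondary}}}{N_{\mathrm{primary}}} \approx \frac{100\ \mathrm{L^{-1}}}{0.01\ \mathrm{L^{-1}}} = 10^{4},$$

i.e. each primary ice crystal must account, on average, for on the order of ten thousand secondary particles produced by rime splintering.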

Relevance: 30.00%

Abstract:

Sampling strategies for monitoring the status of, and trends in, wildlife populations are often determined before the first survey is undertaken. However, there may be little information about the distribution of the population, and so the sample design may be inefficient. Through time, as data are collected, more information about the distribution of animals in the survey region is obtained, but it can be difficult to incorporate this information into the survey design. This paper introduces a framework for monitoring motile wildlife populations within which the design of future surveys can be adapted using data from past surveys, whilst ensuring consistency in design-based estimates of status and trends through time. In each survey, part of the sample is selected from the previous survey sample using simple random sampling. The rest is selected with inclusion probability proportional to predicted abundance. Abundance is predicted using a model constructed from previous survey data and covariates for the whole survey region. Unbiased design-based estimators of status and trends, and their variances, are derived from two-phase sampling theory. Simulations over the short and long term indicate that, in general, more precise estimates of status and trends are obtained using this mixed strategy than a strategy in which all of the sample is retained or all is selected with probability proportional to predicted abundance. Furthermore, the mixed strategy is robust to poor predictions of abundance. Estimates of status are more precise than those obtained from a rotating panel design.
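
A minimal sketch of the mixed selection step (illustrative only: the 50/50 split, and the use of sequential weighted draws to approximate selection with inclusion probability proportional to predicted abundance, are assumptions, not the paper's exact scheme):

```python
import numpy as np

rng = np.random.default_rng(1)

def next_sample(prev_sample, predicted_abundance, n, retain_fraction=0.5):
    """Select the next survey sample of n units.

    prev_sample: indices of units surveyed last time.
    predicted_abundance: model-predicted abundance for every unit in
        the survey region (non-negative).
    A fraction is retained by simple random sampling from the previous
    sample; the rest is drawn with probability proportional to
    predicted abundance from the remaining units.
    """
    n_retain = int(round(retain_fraction * n))
    retained = rng.choice(prev_sample, size=n_retain, replace=False)

    pool = np.setdiff1d(np.arange(len(predicted_abundance)), retained)
    w = predicted_abundance[pool].astype(float)
    # Sequential weighted draws without replacement: a simple
    # approximation to strict probability-proportional-to-size sampling.
    new = rng.choice(pool, size=n - n_retain, replace=False, p=w / w.sum())
    return np.concatenate([retained, new])
```

The two parts of the sample correspond to the two phases from which the design-based estimators of status and trends are built.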

Relevance: 30.00%

Abstract:

An investigation is presented of a quasi-stationary convective system (QSCS) which occurred over the UK Southwest Peninsula on 21 July 2010. This system was remarkably similar in its location and structure to one which caused devastating flash flooding in the coastal village of Boscastle, Cornwall on 16 August 2004. However, in the 2010 case rainfall accumulations were around four times smaller and no flooding was recorded. The more extreme nature of the Boscastle case is shown to be related to three factors: (1) higher rain rates, associated with a warmer and moister tropospheric column and deeper convective clouds; (2) a more stationary system, due to slower evolution of the large-scale flow; and (3) distribution of the heaviest precipitation over fewer river catchments. Overall, however, the synoptic setting of the two events was broadly similar, suggesting that such conditions favour the development of QSCSs over the Southwest Peninsula. A numerical simulation of the July 2010 event was performed using a 1.5-km grid length configuration of the Met Office Unified Model. This reveals that convection was repeatedly initiated through lifting of low-level air parcels along a quasi-stationary coastal convergence line. Sensitivity tests are used to show that this convergence line was a sea breeze front which temporarily stalled along the coastline due to the retarding influence of an offshore-directed background wind component. Several deficiencies are noted in the 1.5-km model’s representation of the storm system, including delayed convective initiation; however, significant improvements are observed when the grid length is reduced to 500 m. These result in part from an improved representation of the convergence line, which enhances the associated low-level ascent allowing air parcels to more readily reach their level of free convection. The implications of this finding for forecasting convective precipitation are discussed.

Relevance: 30.00%

Abstract:

Fiona Ross and Tim Holloway (co-designers) were commissioned by the renowned newspaper and publishing house Anandabazar Patrika (ABP) to design a new low-contrast typeface in a contemporary style for print and screen use in its publications. Ross and Holloway had designed ABP's Bengali house typeface (Linotype Bengali, the first digital Bengali font), which has been in daily use in its newspaper since 1982. The design team was augmented by Neelakash Kshetrimayum; OpenType production was undertaken by John Hudson. This Bengali typeface is the first fully functional OpenType design for the script. It demonstrates innovative features that resolve problems which had hitherto hindered the successful execution of low-contrast Bengali text fonts: this connecting script of over 450 characters has deep verticals, spiralling strokes, wide characters, and intersecting ascenders. The new design removes the need for wide interlinear spacing and sets more words to the line than was previously possible. The project therefore combines aesthetic, technical and linguistic skills and is highly visible, in print and on-line, in the newspapers of the largest newspaper group and publishing house in West Bengal. The design and development of Sarkar has positive implications for other non-Latin script designs, just as the Linotype Bengali typeface formed the blueprint for new non-Latin designs three decades ago. Sarkar was released on 31 August 2012 with the launch of Anandabazar Patrika's new newspaper Ebela.

Relevance: 30.00%

Abstract:

The concepts of on-line transactional processing (OLTP) and on-line analytical processing (OLAP) are often confused with the technologies or models used to design transactional and analytics-based information systems. This has in some measure contributed to gaps between the semantics of information captured during transactional processing and the information stored for analytical use. In this paper, we propose the use of a unified semantics design model as a solution to help bridge the semantic gaps between data captured by OLTP systems and the information provided by OLAP systems. The central focus of this design approach is on enabling business intelligence using not just data, but data with context.
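
As a loose illustration of the "data with context" idea (a sketch under assumed names; it does not reproduce the paper's actual model), a transactional record can carry the semantic context at capture time so that the analytical side does not have to reconstruct it later:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class SaleTransaction:
    """OLTP record that also carries the context the OLAP side needs."""
    sale_id: int
    product_id: int
    amount: float
    recorded_at: datetime
    # Context captured at transaction time (e.g. channel, promotion),
    # preserved rather than discarded on the way to the warehouse.
    context: dict = field(default_factory=dict)

def to_fact_row(tx: SaleTransaction) -> dict:
    """Project the transactional record into an analytical fact row,
    keeping the captured context alongside the measures."""
    return {"sale_id": tx.sale_id,
            "product_id": tx.product_id,
            "amount": tx.amount,
            "date_key": tx.recorded_at.date().isoformat(),
            **tx.context}
```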

Relevance: 30.00%

Abstract:

Context: Emotion regulation is critically disrupted in depression, and paradigms tapping these processes may uncover essential changes in neurobiology during treatment. In addition, as neuroimaging outcome studies of depression commonly utilize only baseline and endpoint data, which are more prone to week-to-week noise in symptomatology, we sought to use all data points over the course of a six-month trial. Objective: To examine changes in neurobiology resulting from successful treatment. Design: Double-blind trial examining changes in the neural circuits involved in emotion regulation resulting from one of two antidepressant treatments over a six-month trial. Participants were scanned pretreatment and at 2 and 6 months post-treatment. Setting: University functional magnetic resonance imaging facility. Participants: 21 patients with Major Depressive Disorder, without other Axis I or Axis II diagnoses, and 14 healthy controls. Interventions: Venlafaxine XR (doses up to 300 mg) or fluoxetine (doses up to 80 mg). Main Outcome Measure: Neural activity, measured using functional magnetic resonance imaging during performance of an emotion regulation paradigm, together with regular assessments of symptom severity on the Hamilton Rating Scale for Depression. To utilize all data points, slope trajectories were calculated for the rate of change in depression severity and the rate of change of neural engagement. Results: The depressed individuals showing the steepest decrease in depression severity over the six months were those showing the most rapid increases in BA10 and right DLPFC activity when regulating negative affect over the same time frame. This relationship was more robust than when using only the baseline and endpoint data. Conclusions: Changes in PFC engagement when regulating negative affect correlate with changes in depression severity over six months. These results are buttressed by slope statistics, which are more reliable and more robust to week-to-week variation than difference scores.
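
A minimal sketch of the slope-trajectory idea (illustrative only: the time points follow the scan schedule above, but the variable names and placeholder values are assumptions):

```python
import numpy as np

months = np.array([0.0, 2.0, 6.0])        # scan/assessment time points

def slope(y, t=months):
    """Least-squares rate of change of y over time t."""
    return np.polyfit(t, y, deg=1)[0]

# One row per participant: severity scores and regional activation at
# the three time points (placeholder values).
hamd = np.array([[24, 18, 9], [22, 20, 15], [26, 14, 7]], dtype=float)
ba10 = np.array([[0.1, 0.4, 0.9], [0.2, 0.3, 0.4], [0.0, 0.5, 1.1]])

sym_slopes = np.apply_along_axis(slope, 1, hamd)   # rate of symptom change
act_slopes = np.apply_along_axis(slope, 1, ba10)   # rate of change in engagement

# Correlate rate of symptom change with rate of change in engagement,
# rather than correlating endpoint-minus-baseline difference scores.
r = np.corrcoef(sym_slopes, act_slopes)[0, 1]
print(f"r = {r:.2f}")
```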

Relevance: 30.00%

Abstract:

The paper considers second kind equations of the form $x = y + K_z x$, in which $(K_z x)(s) = \int_{-\infty}^{\infty} k(s-t)\,z(t)\,x(t)\,dt$ and the factor $z$ is bounded but otherwise arbitrary, so that equations of Wiener-Hopf type are included as a special case. Conditions on a set $W$ of bounded functions are obtained such that a generalized Fredholm alternative is valid: if $W$ satisfies these conditions and $I - K_z$ is injective for each $z \in W$, then $I - K_z$ is invertible for each $z \in W$ and the operators $(I - K_z)^{-1}$ are uniformly bounded. As a special case some classical results relating to Wiener-Hopf operators are reproduced. A finite section version of the above equation (with the range of integration reduced to $[-a, a]$) is considered, as are projection and iterated projection methods for its solution. The operators $(I - K_z^a)^{-1}$ (where $K_z^a$ denotes the finite section version of $K_z$) are shown to be uniformly bounded (in $z$ and $a$) for all $a$ sufficiently large. Uniform stability and convergence results, for the projection and iterated projection methods, are obtained. The argument generalizes an idea in collectively compact operator theory. Some new results in this theory are obtained and applied to the analysis of projection methods for the above equation when $z$ is compactly supported and $k(s-t)$ is replaced by the general kernel $k(s,t)$. A boundary integral equation of the above type, which models outdoor sound propagation over inhomogeneous level terrain, illustrates the application of the theoretical results developed.
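
A numerical illustration of the finite section idea (a simple Nyström-style sketch with an assumed kernel, not the projection methods analysed in the paper):

```python
import numpy as np

def solve_finite_section(k, z, y, a, n=400):
    """Solve x = y + K_z x with the range of integration cut to [-a, a].

    k : convolution kernel, evaluated as k(s - t)
    z : bounded factor z(t)
    y : right-hand side y(s)
    Uses the trapezoidal rule on an n-point grid.
    """
    s = np.linspace(-a, a, n)
    h = s[1] - s[0]
    w = np.full(n, h)
    w[0] = w[-1] = h / 2                      # trapezoid weights
    K = k(s[:, None] - s[None, :]) * z(s)[None, :] * w[None, :]
    x = np.linalg.solve(np.eye(n) - K, y(s))
    return s, x

# Example: a decaying kernel with norm < 1, and z the indicator of
# [0, inf), which makes the equation of Wiener-Hopf type.
k = lambda u: 0.2 * np.exp(-np.abs(u))
z = lambda t: (t >= 0).astype(float)
y = lambda s: np.exp(-s**2)
s, x = solve_finite_section(k, z, y, a=10.0)
```

Increasing $a$ (and $n$) lets one observe the stability of the finite section approximations that the uniform boundedness results above guarantee.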

Relevance: 30.00%

Abstract:

Cross-layer techniques represent efficient means to enhance throughput and increase the transmission reliability of wireless communication systems. In this paper, a cross-layer design of aggressive adaptive modulation and coding (A-AMC), truncated automatic repeat request (T-ARQ), and user scheduling is proposed for multiuser multiple-input multiple-output (MIMO) maximal-ratio-combining (MRC) systems, where the impacts of feedback delay (FD) and limited feedback (LF) on channel state information (CSI) are also considered. The A-AMC and T-ARQ mechanism selects the appropriate modulation and coding schemes (MCSs) to achieve higher spectral efficiency while satisfying the service requirement on packet loss rate (PLR), profiting from the feasibility of using different MCSs to retransmit a packet. Each packet is destined to a scheduled user, selected so as to exploit multiuser diversity and enhance the system's performance in terms of both transmission efficiency and fairness. The system's performance is evaluated in terms of average PLR, average spectral efficiency (ASE), outage probability, and average packet delay, which are derived in closed form, considering transmissions over Rayleigh-fading channels. Numerical results and comparisons are provided; they show that A-AMC combined with T-ARQ yields higher spectral efficiency than the conventional scheme based on adaptive modulation and coding (AMC), while keeping the achieved PLR closer to the system's requirement and reducing delay. Furthermore, the effects of the number of ARQ retransmissions, the numbers of transmit and receive antennas, the normalized FD, and the cardinality of the beamforming weight-vector codebook are studied and discussed.
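
A rough sketch of the aggressive selection idea (the MCS table, SNR thresholds and back-off rule below are assumptions for illustration, not the paper's design):

```python
# (rate in bits/symbol, assumed SNR threshold in dB) -- illustrative values
MCS_TABLE = [(0.5, 2.0), (1.0, 5.0), (2.0, 9.0), (3.0, 13.0), (4.0, 17.0)]

def select_mcs(snr_db, attempt, max_arq=2):
    """Pick an MCS index for a (re)transmission of one packet.

    Conventional AMC would pick the highest MCS whose SNR threshold is
    met.  The aggressive variant transmits one level higher on the
    first attempt, relying on the truncated ARQ retransmissions
    (at most max_arq) to recover with progressively safer MCSs.
    """
    attempt = min(attempt, max_arq)           # T-ARQ: retries are truncated
    eligible = [i for i, (_, thr) in enumerate(MCS_TABLE) if snr_db >= thr]
    base = eligible[-1] if eligible else 0
    if attempt == 0:
        return min(base + 1, len(MCS_TABLE) - 1)   # aggressive first try
    return max(base - (attempt - 1), 0)            # back off on retries
```

The gain comes from the first, aggressive attempt succeeding often enough that the occasional safer retransmission still leaves the average spectral efficiency higher while the PLR target is met.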

Relevance: 30.00%

Abstract:

Using a cross-layer approach, two enhancement techniques applied to adaptive modulation and coding (AMC) with truncated automatic repeat request (T-ARQ) are investigated, namely, aggressive AMC (A-AMC) and constellation rearrangement (CoRe). Aggressive AMC selects the appropriate modulation and coding schemes (MCSs) to achieve higher spectral efficiency, profiting from the feasibility of using different MCSs for retransmitting a packet, whereas in CoRe-based AMC, retransmissions of the same data packet are performed using different mappings so as to provide different degrees of protection to the bits involved, thus achieving mapping diversity gain. The performance of both schemes is evaluated in terms of average spectral efficiency and average packet loss rate, which are derived in closed form considering transmission over Nakagami-m fading channels. Numerical results and comparisons are provided. In particular, it is shown that A-AMC combined with T-ARQ yields higher spectral efficiency than the conventional AMC-based scheme, and that it can achieve larger spectral efficiency than the scheme using AMC along with CoRe, while keeping the achieved packet loss rate closer to the system's requirement.
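
To illustrate constellation rearrangement (a toy 16-QAM example with an assumed Gray mapping; the paper's actual mappings are not reproduced here):

```python
def qam16_symbol(bits, rearranged=False):
    """Map 4 bits to a 16-QAM constellation point.

    In the Gray mapping used here, the first bit of each I/Q pair is
    better protected on average (it selects the half-plane, with one
    decision boundary) than the second (two boundaries).  With
    rearranged=True, the bit roles are swapped for the retransmission,
    so each bit receives the stronger protection once: the mapping
    diversity effect described above.
    """
    if rearranged:
        bits = (bits[1], bits[0], bits[3], bits[2])   # swap protection roles
    gray = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}
    i = gray[(bits[0], bits[1])]
    q = gray[(bits[2], bits[3])]
    return complex(i, q)

first_tx  = qam16_symbol((1, 0, 0, 1))
second_tx = qam16_symbol((1, 0, 0, 1), rearranged=True)
```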

Relevance: 30.00%

Abstract:

Affymetrix GeneChip® arrays are used widely to study transcriptional changes in response to developmental and environmental stimuli. GeneChip® arrays comprise multiple 25-mer oligonucleotide probes per gene and retain certain advantages over direct sequencing. For plants, there are several public GeneChip® arrays whose probes are localised primarily in 3′ exons. Plant whole-transcript (WT) GeneChip® arrays are not yet publicly available, although WT resolution is needed to study complex crop genomes such as Brassica, which are typified by segmental duplications containing paralogous genes and/or allopolyploidy. Available sequence data were sampled from the Brassica A and C genomes, and 142,997 gene models identified. The assembled gene models were then used to establish a comprehensive public WT exon array for transcriptomics studies. The Affymetrix GeneChip® Brassica Exon 1.0 ST Array is a 5 µm feature-size array containing 2.4 million 25-base oligonucleotide probes representing 135,201 gene models, with 15 probes per gene distributed among exons. Discrimination of the gene models was based on an E-value cut-off of 1E-5, with ≤98% sequence identity. The 135 k Brassica Exon Array was validated by quantifying transcriptome differences between leaf and root tissue from a reference Brassica rapa line (R-o-18), with categorisation by Gene Ontologies (GO) based on gene orthology with Arabidopsis thaliana. Technical validation involved comparison of the exon array with a 60-mer array platform using the same starting RNA samples. The 135 k Brassica Exon Array is a robust platform. All data relating to the array design and probe identities are available in the public domain and are curated within the BrassEnsembl genome viewer at http://www.brassica.info/BrassEnsembl/index.html.
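
As a small illustration of how the stated discrimination criterion might be applied (a hedged sketch: the direction of the rule is one plausible reading of the abstract, and the function and names are hypothetical):

```python
def same_gene_model(evalue, identity_pct, e_cutoff=1e-5, identity_ceiling=98.0):
    """One plausible reading of the rule: two assembled sequences are
    collapsed into a single gene model only when they match with
    E-value <= 1e-5 AND more than 98% sequence identity; otherwise
    they are kept as distinct (e.g. paralogous) models."""
    return evalue <= e_cutoff and identity_pct > identity_ceiling

print(same_gene_model(1e-20, 99.4))   # True  -> merge into one model
print(same_gene_model(1e-8, 96.0))    # False -> keep as separate models
```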

Relevance: 30.00%

Abstract:

A growing segment of Chinese women is willing to spend a high percentage of their income on fashion-related products; however, there appears to be concern over the quality of Chinese fashion magazines. This concern centres on two major issues: (i) fashion magazine design, and (ii) the pictorial and textual distribution of content. This paper investigates how human factors (i.e. social norms and individual differences) influence fashion magazine design/format preferences, and examines differences in readership patterns between British and Chinese women. Our study identifies significant differences between UK and Chinese readerships, which have an impact on magazine viewing patterns and content preference.

Relevance: 30.00%

Abstract:

In the UK, architectural design is regulated in the public interest through a system of design control, which aims to secure and promote 'quality' in the built environment. Design control is implemented primarily by locally employed planning professionals with political oversight, and by independent design review panels staffed predominantly by design professionals. Design control has a lengthy and complex history, and the concept of 'design' poses a range of challenges for a regulatory system of governance. As a simultaneously creative and emotive discipline, architectural design is difficult to regulate objectively or consistently, and the resulting policy is often regarded as highly discretionary and flexible. This makes regulatory outcomes difficult to predict, as the approaches taken by the 'agents of control' vary from individual to individual. The role of the design controller is therefore central: the controller is tasked with interpreting design policy and guidance, appraising design quality and passing professional judgment. However, little is known about what influences the way design controllers approach their task, leaving a 'veil' over design control that shrouds the basis of their decisions. This research engaged directly with the attitudes and perceptions of design controllers in the UK, lifting this 'veil'. Using in-depth interviews and Q-Methodology, the thesis explores this hidden element of control, revealing a number of key differences in how controllers approach and implement policy and guidance, conceptualise design quality, and rationalise their evaluations and judgments. The research develops a conceptual framework for agency in design control, consisting of six variables (Regulation, Discretion, Skills, Design Quality, Aesthetics, and Evaluation), and suggests that this could act as a 'heuristic' instrument for UK controllers, prompting greater reflexivity in evaluating their own positions, approaches and attitudes, and leading to better practice and increased transparency in control decisions.

Relevance: 30.00%

Abstract:

The energy of the νh9/2 orbital in nuclei above N = 82 drops rapidly relative to the νf7/2 orbital as the occupancy of the πh11/2 orbital increases. These two neutron orbitals become nearly degenerate as the proton drip line is approached. In this work, we have discovered the new nuclides 161Os and 157W, and studied the decays of the proton emitter 160Re in detail. The 161Os and 160Re nuclei were produced in reactions of 290, 300 and 310 MeV 58Ni ions with an isotopically enriched 106Cd target, separated in-flight using the RITU separator and implanted into the GREAT spectrometer. The 161Os α decays populated the new nuclide 157W, which decayed by β-particle emission. The β decay fed the known α-decaying 1/2+ and 11/2− states in 157Ta, which is consistent with a νf7/2 ground state in 157W. The measured α-decay energy and half-life for 161Os correspond to a reduced α-decay width compatible with s-wave α-particle emission, implying that its ground state is also a νf7/2 state. Over 7000 160Re nuclei were produced, and the γ decays of a new isomeric state feeding the πd3/2 level in 160Re were discovered, but no evidence for the proton or α decay of the expected πh11/2 state could be found. The isomer decays offer a natural explanation for this non-observation and provide a striking example of the influence of the near degeneracy of the νh9/2 and νf7/2 orbitals on the properties of nuclei in this region.