971 results for Absolute generality


Relevance: 10.00%

Abstract:

X + Y has attitude and enigma, re-establishing a venue drenched in a Valley persona etched over seven decades and surrounded by the bars of a new generation. This is a bar without judgement, without dress codes and without absolute refinement.

Relevance: 10.00%

Abstract:

Purpose: To investigate the interocular symmetry of the optical, biometric and biomechanical characteristics of the fellow eyes of myopic anisometropes. Methods: Thirty-four young, healthy myopic anisometropic adults (≥ 1 D spherical equivalent difference between the eyes) without amblyopia or strabismus were recruited. A range of biometric and optical parameters were measured in both eyes of each subject, including axial length, ocular aberrations, intraocular pressure (IOP), corneal topography and corneal biomechanics. Ocular sighting dominance was also measured. Results: Mean absolute spherical equivalent anisometropia was 1.70 ± 0.74 D, and there was a strong correlation between the degree of anisometropia and the interocular difference in axial length (r = 0.81, p < 0.001). The more and less myopic eyes displayed a high degree of interocular symmetry for the majority of the biometric, biomechanical and optical parameters measured. When the level of anisometropia exceeded 1.75 D, the more myopic eye was more likely to be the dominant sighting eye than at lower levels of anisometropia (p = 0.002). Subjects with greater levels of anisometropia (> 1.75 D) also showed high levels of correlation between the dominant and non-dominant eyes in their biometric, biomechanical and optical characteristics. Conclusions: Although significantly different in axial length, anisometropic fellow eyes display a high degree of interocular symmetry for a range of anterior eye biometric and optical parameters. For higher levels of anisometropia, the more myopic eye tends to be the dominant sighting eye.
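For reference, the refraction quantities used above follow the standard optometric conventions; a minimal sketch in our notation, not reproduced from the paper:

\[
SE = S + \frac{C}{2}, \qquad \text{anisometropia} = \lvert SE_{\text{right}} - SE_{\text{left}} \rvert \;\ge\; 1\,\mathrm{D}\ \text{(inclusion criterion)}
\]

where S is the spherical power and C the cylindrical power of each eye's refraction, both in dioptres (D).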

Relevance: 10.00%

Abstract:

Previous research has put forward a number of properties of business process models that have an impact on their understandability. Two such properties are compactness and (block-)structuredness. What has not been sufficiently appreciated at this point is that these desirable properties may be at odds with one another. This paper presents the results of a two-pronged study aimed at exploring the trade-off between the compactness and structuredness of process models. The first prong of the study is a comparative analysis of the complexity of a set of unstructured process models from industrial practice and of their corresponding structured versions. The second prong is an experiment wherein a cohort of students was exposed to semantically equivalent unstructured and structured process models. The key finding is that structuredness is not an absolute desideratum for process model understandability. Instead, subtle trade-offs between structuredness and other model properties are at play.

Relevance: 10.00%

Abstract:

Purpose - Thermo-magnetic convection and heat transfer of a paramagnetic fluid placed in a micro-gravity condition (g = 0), under a uniform vertical gradient magnetic field, in an open square cavity with three cold sidewalls have been studied numerically. Design/methodology/approach - The magnetic force is proportional to the magnetic susceptibility and the gradient of the square of the magnetic induction. The magnetic susceptibility is inversely proportional to the absolute temperature, following Curie's law. Thermal convection of a paramagnetic fluid can therefore take place even in a zero-gravity environment, as a direct consequence of the temperature differences that arise within the fluid, due to constant internal heat generation, when it is placed within a magnetic field gradient. Findings - The effects of the magnetic Rayleigh number, Ra, the Prandtl number, Pr, and the paramagnetic fluid parameter, m, on the flow pattern and isotherms, as well as on the heat absorption, are presented graphically. It is found that the heat transfer rate is suppressed as the magnetic Rayleigh number and the paramagnetic fluid parameter increase. Originality/value - It is possible to control the buoyancy force by using a superconducting magnet. To the best of the author's knowledge, no literature related to magnetic convection for this configuration is available.
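A minimal sketch of the mechanism the abstract describes, in the standard form of the magnetization (Kelvin) body force on a paramagnetic fluid (our notation, not the paper's):

\[
\mathbf{f}_{m} = \frac{\chi}{2\mu_{0}}\,\nabla B^{2}, \qquad \chi = \frac{C_{m}}{T}\ \ \text{(Curie's law)},
\]

where χ is the volumetric magnetic susceptibility, μ0 the permeability of free space, B the magnetic induction and C_m the Curie constant. Because χ falls with absolute temperature T, warmer fluid experiences a weaker magnetic body force than cooler fluid in the same field gradient, which is what drives convection even at g = 0.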

Relevance: 10.00%

Abstract:

Female genital mutilation (FGM) is a cultural practice common in many Islamic societies. It involves the deliberate, non-therapeutic physical modification of young girls’ genitalia. FGM can take several forms, ranging from less damaging incisions to actual removal of genitalia and narrowing or even closing of the vagina. While often thought to be required by religion, FGM both predates the Koran and has no basis in it. Rather, it is a cultural tradition, motivated by a patriarchal social desire to control female bodies to ensure virginity at marriage (preserving family honour), and to prevent infidelity by limiting sexual desire. In the USA and Australia in 2010, peak medical bodies considered endorsing the medical administration of a ‘lesser’ form of FGM. The basis for this was pragmatic: it would be preferable to satisfy patients’ desire for FGM in medically-controlled conditions, rather than have these patients seek it, possibly in more severe forms, under less safe conditions. While arguments favouring medically-administered FGM were soon overcome, the prospect of endorsing FGM illuminated the issue in these two Western countries and beyond. This paper will review the nature of FGM, its physical and psychological health consequences, and Australian laws prohibiting FGM. Then, it will scan recent developments in Africa, where FGM has been made illegal by a growing number of nations and by the Protocol to the African Charter on Human and Peoples’ Rights 2003 (the Maputo Protocol), but is still proving difficult to eradicate. Finally, based on arguments derived from theories of rights, health evidence, and the historical and religious contexts, this paper will ask whether an absolute human right against FGM can be developed.

Relevance: 10.00%

Abstract:

Characteristics of the road infrastructure affect both the popularity of bicycling and its safety, but comparisons of the safety performance of infrastructure may be confounded by differences in the profiles of cyclists who use them. Data from a survey of 2,532 adult bicycle riders in Queensland, Australia, demonstrated that many riders rode reluctantly in particular locations and that preference for riding location was influenced by degree of experience and riding purpose. Most riders rode most often and furthest per week on urban roads, but approximately one-third of all riders (and more new riders) rode there reluctantly. Almost two-thirds of riders rode on bicycle paths, most by choice, not reluctantly. New riders rode proportionally more on bicycle paths, but continuing riders rode further in absolute terms. Utilitarian riders were more likely to ride on bicycle paths than social and fitness riders, and almost all of this riding was by choice. Fitness riders were more reluctant in their use of bicycle paths, but still most of their use was by choice. One-third of the respondents reported riding on the sidewalk (legal in Queensland), with approximately two-thirds doing so reluctantly. The frequency and distance ridden on the sidewalk were less than for urban roads and bicycle paths. Sidewalks and bicycle paths were important facilities for both inexperienced and experienced riders and for utilitarian riding, especially when urban roads were considered a poor choice for cycling.

Relevance: 10.00%

Abstract:

Purpose: The Cobb technique is the universally accepted method for measuring the severity of spinal deformities. Traditionally, Cobb angles have been measured using protractor and pencil on hardcopy radiographic films. The new generation of mobile phones makes accurate angle measurement possible using an integrated accelerometer, providing a potentially useful clinical tool for assessing Cobb angles. The purpose of this study was to compare Cobb angle measurements performed using an Apple iPhone and a traditional protractor in a series of twenty Adolescent Idiopathic Scoliosis patients. Methods: Seven observers measured major Cobb angles on twenty pre-operative postero-anterior radiographs of Adolescent Idiopathic Scoliosis patients with both a standard protractor and an Apple iPhone. Five of the observers repeated the measurements at least a week after the original measurements. Results: The mean absolute difference between pairs of iPhone/protractor measurements was 2.1°, with a small (1°) bias toward lower Cobb angles with the iPhone. 95% confidence intervals for intra-observer variability were ±3.3° for the protractor and ±3.9° for the iPhone. 95% confidence intervals for inter-observer variability were ±8.3° for the iPhone and ±7.1° for the protractor. Both of these confidence intervals were within the range of previously published Cobb measurement studies. Conclusions: We conclude that the iPhone is a Cobb measurement tool equivalent to the manual protractor, with measurement times about 15% shorter. The widespread availability of inclinometer-equipped mobile phones, and the ability to store measurements in later versions of the angle measurement software, may make these new technologies attractive for clinical measurement applications.
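As an illustration of why an accelerometer suffices for this measurement: the Cobb angle is the angle between the endplates of the two end vertebrae, so it can be obtained from two inclinometer readings taken with the phone's edge aligned to each endplate in turn. A minimal sketch, assuming a device that reports gravity components along its screen axes; the function names and two-reading workflow are illustrative, not taken from the app used in the study:

```python
import math

def inclination_deg(ax: float, ay: float) -> float:
    """Tilt of the device from vertical, in degrees, computed from the
    gravity components along the screen's x and y axes while held still."""
    return math.degrees(math.atan2(ax, ay))

def cobb_angle(upper: tuple, lower: tuple) -> float:
    """Cobb angle: the angle between the superior endplate of the upper end
    vertebra and the inferior endplate of the lower end vertebra, taken as
    the difference between two inclinometer readings (ax, ay)."""
    return abs(inclination_deg(*upper) - inclination_deg(*lower))

# Example: endplates tilted +18 and -24 degrees from horizontal give a 42 degree curve.
r = math.radians
print(cobb_angle((math.sin(r(18)), math.cos(r(18))),
                 (math.sin(r(-24)), math.cos(r(-24)))))  # ~42.0
```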

Relevance: 10.00%

Abstract:

The objective quantification of three-dimensional kinematics during different functional and occupational tasks is now more in demand than ever. The introduction of a new generation of low-cost passive motion capture systems from a number of manufacturers has made this technology accessible for teaching, clinical practice and small/medium industry. Despite the attractive nature of these systems, their accuracy remains unproven in independent tests. We assessed static linear accuracy and dynamic linear accuracy, and compared gait kinematics from a Vicon MX20 system with those from a Natural Point OptiTrack system. In all experiments data were sampled simultaneously. Both systems performed excellently in the linear accuracy tests, with absolute errors not exceeding 1%. In the gait data there was again strong agreement between the two systems in sagittal and coronal plane kinematics. Transverse plane kinematics differed by up to 3° at the knee and hip, which we attributed to the impact of soft tissue artifact accelerations on the data. We suggest that low-cost systems are comparably accurate to their high-end competitors and offer a platform with accuracy acceptable for research in laboratories with a limited budget.
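The "1%" criterion corresponds to an error measure of roughly this form; a minimal sketch, assuming a rigid calibration object of known length (our formulation, not necessarily the authors' exact protocol):

```python
def linear_accuracy_error(measured_mm: list, reference_mm: float) -> float:
    """Mean absolute error of repeated length measurements, expressed as a
    percentage of the known reference length."""
    n = len(measured_mm)
    return 100.0 * sum(abs(m - reference_mm) for m in measured_mm) / (n * reference_mm)

# Example: a 500 mm rigid calibration bar as reconstructed by a capture system.
print(linear_accuracy_error([499.2, 500.5, 499.8, 500.9], 500.0))  # ~0.12 (%)
```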

Relevance: 10.00%

Abstract:

The purpose of this study was to determine the effects of cryotherapy, in the form of cold water immersion, on knee joint position sense. Fourteen healthy volunteers, with no previous knee injury or pre-existing clinical condition, participated in this randomized cross-over trial. The intervention consisted of a 30-min immersion, to the level of the umbilicus, in either cold (14 ± 1°C) or tepid (28 ± 1°C) water. Approximately one week later, in a randomized fashion, the volunteers completed the remaining immersion. Active ipsilateral limb repositioning sense of the right knee was measured, using weight-bearing and non-weight-bearing assessments, employing video-recorded 3D motion analysis. These assessments were conducted immediately before and after the cold and tepid water immersions. No significant differences were found between the treatments for the absolute (P = 0.29), relative (P = 0.21) or variable error (P = 0.86). The average effect size of the outcome measures was modest (range -0.49 to 0.9) and all of the associated 95% confidence intervals for these effect sizes crossed zero. These results indicate that there is no evidence of an enhanced risk of injury, following a return to sporting activity, after cold water immersion.
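The three error scores are standard measures from the motor-control literature; a minimal sketch of how they are commonly defined for n repositioning trials with target angle T and reproduced angles x_i (our notation; the paper's exact formulation may differ, and "relative error" is taken here as the signed constant error):

\[
AE = \frac{1}{n}\sum_{i=1}^{n}\lvert x_i - T\rvert, \qquad
CE = \frac{1}{n}\sum_{i=1}^{n}(x_i - T), \qquad
VE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\bigl(x_i - T - CE\bigr)^{2}}
\]

Absolute error (AE) captures overall accuracy, constant error (CE) captures directional bias, and variable error (VE) captures trial-to-trial consistency.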

Relevance: 10.00%

Abstract:

Australasian marsupials include three major radiations, the insectivorous/carnivorous Dasyuromorphia, the omnivorous bandicoots (Peramelemorphia), and the largely herbivorous diprotodontians. Morphologists have generally considered the bandicoots and diprotodontians to be closely related, most prominently because they are both syndactylous (with the 2nd and 3rd pedal digits being fused). Molecular studies have been unable to confirm or reject this Syndactyla hypothesis. Here we present new mitochondrial (mt) genomes from a spiny bandicoot (Echymipera rufescens) and two dasyurids, a fat-tailed dunnart (Sminthopsis crassicaudata) and a northern quoll (Dasyurus hallucatus). By comparing trees derived from pairwise base-frequency differences between taxa with standard (absolute, uncorrected) distance trees, we infer that composition bias among mt protein-coding and RNA sequences is sufficient to mislead tree reconstruction. This can explain incongruence between trees obtained from mt and nuclear data sets. However, after excluding major sources of compositional heterogeneity, both the “reduced-bias” mt and nuclear data sets clearly favor a bandicoot plus dasyuromorphian association, as well as a grouping of kangaroos and possums (Phalangeriformes) among diprotodontians. Notably, alternatives to these groupings could only be confidently rejected by combining the mt and nuclear data. Elsewhere on the tree, Dromiciops appears to be sister to the monophyletic Australasian marsupials, whereas the placement of the marsupial mole (Notoryctes) remains problematic. More generally, we contend that it is desirable to combine mt genome and nuclear sequences for inferring vertebrate phylogeny, but as separately modeled process partitions. This strategy depends on detecting and excluding (or accounting for) major sources of nonhistorical signal, such as from compositional nonstationarity.
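The contrast drawn above, between trees built from pairwise base-frequency differences and standard uncorrected distance trees, rests on two simple pairwise measures. A minimal sketch, assuming aligned nucleotide sequences (our implementation, not the authors' pipeline):

```python
from collections import Counter

def p_distance(seq1: str, seq2: str) -> float:
    """Uncorrected (absolute) distance: the proportion of aligned sites at
    which the two sequences differ."""
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a in "ACGT" and b in "ACGT"]
    return sum(a != b for a, b in pairs) / len(pairs)

def base_freq_distance(seq1: str, seq2: str) -> float:
    """Compositional distance: Euclidean distance between the sequences'
    base-frequency vectors. Large values flag the compositional
    heterogeneity that can mislead distance-based tree reconstruction."""
    def freqs(seq):
        counts = Counter(b for b in seq if b in "ACGT")
        total = sum(counts.values())
        return [counts[b] / total for b in "ACGT"]
    return sum((f1 - f2) ** 2 for f1, f2 in zip(freqs(seq1), freqs(seq2))) ** 0.5

print(p_distance("ACGTACGT", "ACGTACGA"))          # 0.125
print(base_freq_distance("AAAAGGGG", "AACCGGTT"))  # 0.5: compositions differ
```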

Relevance: 10.00%

Abstract:

The opening phrase of the title is from Charles Darwin’s notebooks (Schweber 1977). It is a double reminder: firstly, that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or ‘causes’; and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title reflects our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, thus rejecting a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution; formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion here is in a Popperian framework, where science is defined as that area of study in which it is possible, in principle, to find evidence against hypotheses – they are in principle falsifiable. With time, however, the boundaries of science keep expanding. In the past, some aspects of evolution were outside the then-current boundaries of falsifiable science, but new techniques and ideas are increasingly expanding those boundaries, and it is appropriate to re-examine some topics. Over the last few decades there appears to have been an increasingly strong assumption to look first (and only) for a physical cause. This decision is virtually never formally discussed; the assumption is simply made that some physical factor ‘drives’ evolution. It is necessary to examine our assumptions much more carefully: what is meant by physical factors ‘driving’ evolution, or by an ‘explosive radiation’? Our discussion focuses on two of the six mass extinctions: the fifth, the events of the Late Cretaceous, and the sixth, which began at least 50,000 years ago and is ongoing.
Cretaceous/Tertiary boundary: the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example, the number of lineages of birds and mammals that survive from the Cretaceous to the present is one test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary and then went extinct later. Our initial suggestion was probably too narrow, in that it lumped four models from Penny and Phillips (2004) into one. This reduction is too simplistic, in that we need to know about survival and about ecological and morphological divergences during the Late Cretaceous, and whether crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly that interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible.
Mass extinction number six: human impacts. On a broad scale, there is a good correlation between the time of human arrival and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005), and on a finer scale there are still large numbers of possibilities. In Hurles et al. (2003) we mentioned habitat modification (including the use of fire) and introduced plants and animals (including kiore), in addition to direct predation (the ‘overkill’ hypothesis). We also need to consider the prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on the middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect the distribution in the subfossil record. A better understanding of human impacts world-wide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing ‘blame’. While spontaneous generation was still universally accepted, there was an expectation that animals would simply continue to reappear. New Zealand is one of the very best locations in the world to study many of these issues: apart from the marine fossil record, some human impact events are extremely recent and the remains are less disrupted by time.

Relevance: 10.00%

Abstract:

This letter presents a technique to assess the overall network performance of sampled value process buses based on IEC 61850-9-2, using measurements from a single location in the network. The method is based upon the use of Ethernet cards with externally synchronized time stamping, together with characteristics of the process bus protocol. The application and utility of the method are demonstrated by measuring the latency introduced by Ethernet switches. Network latency can be measured from a single set of captures, rather than by comparing source and destination captures. Absolute latency measures will greatly assist the design, testing, commissioning and maintenance of these critical data networks.
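As an illustration of the approach: because every IEC 61850-9-2 sampled value frame carries an identifier (svID) and a sample counter (smpCnt), the same frame can be matched in captures taken on either side of a switch, and externally synchronized hardware timestamps then give its transit latency directly. A minimal sketch, with the capture format and field handling simplified as assumptions:

```python
def switch_latency_us(ingress: dict, egress: dict) -> dict:
    """Per-frame transit latency through a switch, in microseconds. Each dict
    maps a frame key, e.g. an (svID, smpCnt) pair from the sampled value
    header, to a hardware timestamp in nanoseconds recorded by an externally
    synchronized capture card on that side of the switch."""
    return {key: (egress[key] - ingress[key]) / 1000.0
            for key in ingress if key in egress}

# Example: two frames from merging unit "MU01", timestamps in ns.
ingress = {("MU01", 1200): 1_000_000, ("MU01", 1201): 1_250_000}
egress  = {("MU01", 1200): 1_006_400, ("MU01", 1201): 1_256_900}
print(switch_latency_us(ingress, egress))  # {('MU01', 1200): 6.4, ('MU01', 1201): 6.9}
```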

Relevance: 10.00%

Abstract:

Cyclic nitroxide radicals represent promising alternatives to the iodine-based redox mediator commonly used in dye-sensitized solar cells (DSSCs). To date, DSSCs with nitroxide-based redox mediators have achieved energy conversion efficiencies of just over 5%, but efficiencies of over 15% might be achievable given an appropriate mediator. The efficacy of the mediator depends upon two main factors: it must reversibly undergo one-electron oxidation, and it must possess an oxidation potential in the range of 0.600-0.850 V (vs. the standard hydrogen electrode (SHE) in acetonitrile at 25 °C). Herein, we have examined the effect that structural modifications have on the value of the oxidation potential of cyclic nitroxides, as well as on the reversibility of the oxidation process. These included alterations to the N-containing skeleton (pyrrolidine, piperidine, isoindoline, azaphenalene, etc.), as well as the introduction of different substituents (alkyl-, methoxy-, amino-, carboxy-, etc.) to the ring. Standard oxidation potentials were calculated using high-level ab initio methodology that was demonstrated to be very accurate (with a mean absolute deviation from experimental values of only 16 mV). An optimal value of 1.45 for the electrostatic scaling factor for UAKS radii in acetonitrile solution was obtained. Established trends in the values of the oxidation potentials were used to guide the molecular design of stable nitroxides with the desired E°ox, and a number of compounds are suggested for potential use as enhanced redox mediators in DSSCs.
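A minimal sketch of the thermodynamic cycle underlying such calculations (standard computational electrochemistry; our notation, with the absolute reference potential left symbolic rather than quoting a value):

\[
\mathrm{N} \;\longrightarrow\; \mathrm{N}^{+} + e^{-}, \qquad
E^{\circ}_{\mathrm{ox}} \;=\; \frac{\Delta G^{\circ}_{\mathrm{ox,\,soln}}}{nF} \;-\; E^{\mathrm{abs}}_{\mathrm{SHE}},
\]

where ΔG°_ox,soln combines the gas-phase ionization free energy with the solvation free energies of the nitroxide and its oxidized cation (this is where the continuum-solvation radii, and hence the electrostatic scaling factor mentioned above, enter), n = 1, F is the Faraday constant, and E^abs_SHE is the absolute potential of the standard hydrogen electrode in the chosen solvent.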

Relevance: 10.00%

Abstract:

Nutrition interventions, in the form of both self-management education and individualised diet therapy, are considered essential for the long-term management of type 2 diabetes mellitus (T2DM). The measurement of diet is essential to inform, support and evaluate nutrition interventions in the management of T2DM. Barriers inherent within health care settings and systems limit ongoing access to personnel and resources, while traditional prospective methods of assessing diet are burdensome for the individual and often result in changes to typical intake in order to facilitate recording. This thesis investigated the inclusion of information and communication technologies (ICT) to overcome the limitations of current approaches to the nutritional management of T2DM, in particular the development, trial and evaluation of the Nutricam dietary assessment method (NuDAM), consisting of a mobile phone photo/voice application to assess nutrient intake in a free-living environment with older adults with T2DM.
Study 1: Effectiveness of an automated telephone system in promoting change in dietary intake among adults with T2DM. The effectiveness of an automated telephone system, Telephone-Linked Care (TLC) Diabetes, designed to deliver self-management education, was evaluated in terms of promoting dietary change in adults with T2DM and sub-optimal glycaemic control. In this secondary data analysis, independent of the larger randomised controlled trial, complete data were available for 95 adults (59 male; mean age (±SD) = 56.8±8.1 years; mean (±SD) BMI = 34.2±7.0 kg/m2). The treatment effect showed a reduction in total fat of 1.4% and saturated fat of 0.9% of energy intake, in body weight of 0.7 kg and in waist circumference of 2.0 cm. In addition, a significant increase in the nutrition self-efficacy score of 1.3 (p<0.05) was observed in the TLC group compared to the control group. The modest trends observed in this study indicate that the TLC Diabetes system does support the adoption of positive nutrition behaviours as a result of diabetes self-management education; however, caution must be applied in interpreting the results due to the inherent limitations of the dietary assessment method used. The decision to use a closed-list FFQ with known bias may have influenced the accuracy of reported dietary intake in this instance. This study provided an example of the methodological challenges of measuring changes in absolute diet using an FFQ, and reaffirmed the need for novel prospective assessment methods capable of capturing the natural variance in usual intakes.
Study 2: The development and trial of the NuDAM recording protocol. The feasibility of the Nutricam mobile phone photo/voice dietary record was evaluated in 10 adults with T2DM (6 male; age = 64.7±3.8 years; BMI = 33.9±7.0 kg/m2). Intake was recorded over a 3-day period using both Nutricam and a written estimated food record (EFR). Compared to the EFR, the Nutricam device was found to be acceptable among subjects; however, energy intake was under-recorded using Nutricam (-0.6±0.8 MJ/day; p<0.05). Beverages and snacks were the items most frequently not recorded using Nutricam, but forgotten meals contributed the greatest difference in energy intake between the records. In addition, the quality of the dietary data recorded using Nutricam was unacceptable for just under one-third of entries. It was concluded that an additional mechanism was necessary to complement the dietary information collected via Nutricam. Modifications were made to the method to allow Nutricam entries to be clarified, and forgotten foods to be probed for, during a brief phone call to the subject the following morning. The revised recording protocol was evaluated in Study 4.
Study 3: The development and trial of the NuDAM analysis protocol. Part A explored the effect of the type of portion size estimation aid (PSEA) on the error associated with quantifying four portions of 15 single food items contained in photographs. Seventeen dietetic students (1 male; age = 24.7±9.1 years; BMI = 21.1±1.9 kg/m2) estimated all food portions on two occasions: without aids, and with aids (food models or reference food photographs). Overall, the use of a PSEA significantly reduced the mean (±SD) group error between estimates compared to no aid (-2.5±11.5% vs. 19.0±28.8%; p<0.05). The type of PSEA (food models vs. reference food photographs) did not have a notable effect on the group estimation error (-6.7±14.9% vs. 1.4±5.9% respectively; p=0.321). This exploratory study provided evidence that the use of aids in general, rather than their type, was more effective in reducing estimation error. The findings guided the development of the Dietary Estimation and Assessment Tool (DEAT) for use in the analysis of the Nutricam dietary record. Part B evaluated the effect of the DEAT on the error associated with quantifying two 3-day Nutricam dietary records in a sample of 29 dietetic students (2 males; age = 23.3±5.1 years; BMI = 20.6±1.9 kg/m2). Subjects were randomised into two groups: Group A and Group B. For Record 1, use of the DEAT (Group A) resulted in a smaller error than estimation without the tool (Group B) (17.7±15.8%/day vs. 34.0±22.6%/day respectively; p=0.331). In comparison, all subjects used the DEAT to estimate Record 2, with the resultant error similar between Groups A and B (21.2±19.2%/day vs. 25.8±13.6%/day respectively; p=0.377). In general, the moderate estimation error associated with quantifying food items did not translate into clinically significant differences in the nutrient profile of the Nutricam dietary records; only amorphous foods were notably over-estimated in energy content without the use of the DEAT (57 kJ/day vs. 274 kJ/day; p<0.001). A large proportion of the group (89.6%) found the DEAT helpful when quantifying food items contained in the Nutricam dietary records. The use of the DEAT reduced quantification error, minimising any potential effect on the estimation of energy and macronutrient intake.
Study 4: Evaluation of the NuDAM. The accuracy and inter-rater reliability of the NuDAM in assessing energy and macronutrient intake were evaluated in a sample of 10 adults (6 males; age = 61.2±6.9 years; BMI = 31.0±4.5 kg/m2). Intake recorded using both the NuDAM and a weighed food record (WFR) was coded by three dietitians and compared with an objective measure of total energy expenditure (TEE) obtained using the doubly labelled water technique. At the group level, energy intake (EI) was under-reported to a similar extent using both methods, with an EI:TEE ratio of 0.76±0.20 for the NuDAM and 0.76±0.17 for the WFR. At the individual level, four subjects reported implausible levels of energy intake using the WFR, compared to three using the NuDAM. Overall, moderate to high correlation coefficients (r=0.57-0.85) were found between the two dietary measures for energy and all macronutrients except fat (r=0.24). High agreement was observed between dietitians for the estimates of energy and macronutrients derived from both the NuDAM (ICC=0.77-0.99; p<0.001) and the WFR (ICC=0.82-0.99; p<0.001). All subjects preferred using the NuDAM over the WFR to record intake, and were willing to use the novel method again over longer recording periods.
This research program explored two novel approaches, using distinct technologies, to aid the nutritional management of adults with T2DM. In particular, this thesis makes a significant contribution to the evidence base surrounding the use of PhRs through the development, trial and evaluation of a novel mobile phone photo/voice dietary record. The NuDAM is an extremely promising advancement in the nutritional management of individuals with diabetes and other chronic conditions. Future applications lie in integrating the NuDAM with other technologies to facilitate practice across the remaining stages of the nutrition care process.
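The EI:TEE comparison in Study 4 reduces to a simple ratio check; a minimal sketch (the 0.76 group means are from the abstract, but the plausibility band below is a common convention, not a threshold the thesis necessarily used):

```python
def ei_tee_ratio(energy_intake_mj: float, energy_expenditure_mj: float) -> float:
    """Ratio of reported energy intake (EI) to measured total energy
    expenditure (TEE, e.g. from doubly labelled water). In weight-stable
    adults EI should roughly equal TEE, so ratios well below 1.0 suggest
    under-reporting."""
    return energy_intake_mj / energy_expenditure_mj

def plausible(ratio: float, low: float = 0.76, high: float = 1.24) -> bool:
    # Illustrative cut-offs only; published plausibility bands vary by method.
    return low <= ratio <= high

print(round(ei_tee_ratio(8.0, 10.5), 2))  # 0.76: about 24% under-reporting
print(plausible(0.58))                    # False: an implausibly low report
```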

Relevance: 10.00%

Abstract:

Cities have long held a fascination for people: as they grow and develop, there is a desire to know and understand the intricate interplay of elements that makes cities ‘live’. In part, this is a need for ever greater efficiency in urban centres, yet the underlying quest is for a sustainable urban form. In order to make sense of the complex entities that we recognise cities to be, they have been compared to buildings, to organisms and, more recently, to machines. The search for better and more elegant urban centres is hardly new, however: healthier and more efficient settlements were the aim of Modernism’s rational sub-division of functions, which has been translated into horizontal distribution through zoning, or vertical organisation through high-rise developments. Both of these approaches have been found to be unsustainable, as too many resources are required to maintain this kind of urbanisation, and the social consequences of either horizontal or vertical isolation must also be considered. From being absolute consumers of resources, of energy and of technology, cities need to change, to become sustainable in order to be more resilient and more efficient in supporting culture and society as well as the economy. Our urban centres need to be re-imagined, re-conceptualised and re-defined to match our changing society. One approach is to re-examine the compartmentalised, mono-functional approach of urban Modernism and to begin to investigate cities as ecologies, where every element supports and incorporates another, fulfilling more than just one function. This manner of seeing the city suggests a framework to guide the re-mixing of urban settlements. Beginning to understand the relationships between supporting elements, and the nature of the connecting ‘web’, offers an invitation to investigate the often ignored, remnant spaces of cities. This ‘negative space’ is the residue from which space and place are carved out in the contemporary city, providing the link between elements of the urban settlement. Like all successful ecosystems, cities need to evolve and change over time in order to respond effectively to different lifestyles, to developments in culture and society, and to environmental challenges. This paper investigates the role that negative space could play in the reorganisation of the re-mixed city. The space ‘in-between’ is analysed as an opportunity for infill development or re-development which provides the urban settlement with the variety that is a prerequisite for ecosystem resilience. An analysis of urban form is suggested as an empirical tool to map the opportunities already present in the urban environment, and negative space is evaluated as a key element in achieving a positive development able to distribute diverse environmental and social facilities in the city.