145 results for LOPEZ JORDAN
Abstract:
This approach to sustainable design explores the possibility of creating an architectural design process that can iteratively produce optimised and sustainable design solutions. Driven by an evolution process based on genetic algorithms, the system allows the designer to “design the building design generator” rather than to “design the building”. The design concept is abstracted into a digital design schema, which allows the transfer of the human creative vision into the rational language of a computer. The schema is then elaborated through genetic algorithms to evolve innovative, performative and sustainable design solutions. The prioritisation of the project’s constraints and the subsequent design solutions synthesised during design generation are expected to resolve most of the major conflicts in the evaluation and optimisation phases. Mosques are used as the example building typology to ground the research activity. The spatial organisations of various mosque typologies are graphically represented by adjacency constraints between spaces. Each configuration is represented by a planar graph, which is then translated into a non-orthogonal dual graph and fed into the genetic algorithm system with fixed constraints and expected performance criteria set to govern evolution. The resultant Hierarchical Evolutionary Algorithmic Design System is developed by linking the evaluation process with environmental assessment tools to rank the candidate designs. The proposed system generates the concept, the seed, and the schema, and has environmental performance as one of the main criteria driving optimisation.
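A deliberately small sketch may help make the pipeline concrete. It shows the kind of loop the abstract describes: adjacency constraints between spaces form a graph, candidate layouts are evolved by a genetic algorithm, and fitness counts satisfied adjacencies. The spaces, grid encoding, operators, and parameters below are invented for illustration; the thesis's actual schema, dual-graph translation, and environmental evaluation are not reproduced here.

```python
import random

# Illustrative spaces and required adjacencies (edges of the constraint graph).
SPACES = ["prayer_hall", "mihrab", "courtyard", "ablution", "minaret", "entrance"]
ADJACENCY = {("prayer_hall", "mihrab"), ("prayer_hall", "courtyard"),
             ("courtyard", "entrance"), ("courtyard", "ablution")}
GRID = [(r, c) for r in range(2) for c in range(3)]   # six cells, one per space

def fitness(layout):
    """Count required adjacencies realised by orthogonally neighbouring cells."""
    pos = dict(zip(SPACES, layout))
    return sum(abs(pos[a][0] - pos[b][0]) + abs(pos[a][1] - pos[b][1]) == 1
               for a, b in ADJACENCY)

def crossover(p1, p2):
    """Order crossover: keep a prefix of p1, fill remaining cells in p2's order."""
    cut = random.randrange(1, len(p1))
    head = p1[:cut]
    return head + [cell for cell in p2 if cell not in head]

def mutate(layout):
    """Swap the cells assigned to two randomly chosen spaces."""
    i, j = random.sample(range(len(layout)), 2)
    layout[i], layout[j] = layout[j], layout[i]
    return layout

population = [random.sample(GRID, len(GRID)) for _ in range(40)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                          # truncation selection
    population = parents + [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(30)]

best = max(population, key=fitness)
print(f"satisfied adjacencies: {fitness(best)}/{len(ADJACENCY)}")
```

In the system the abstract describes, the fitness ranking would instead come from linked environmental assessment tools, and the adjacency graph would be translated into a non-orthogonal dual graph rather than placed on a fixed grid.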
Abstract:
Over the years, approaches to obesity prevention and treatment have gone from focusing on genetic and other biological factors, to exploring a diversity of diets and individual behavior-modification interventions anchored primarily in the power of the mind, to the recent shift towards societal interventions that design "temptation-proof" physical, social, and economic environments. In spite of repeated calls to action, including those of the World Health Organization (WHO), the pandemic continues to progress. WHO recently projected that if the current lifestyle trends in young and adult populations around the world persist, by 2012 in countries like the USA health care costs may amount to as much as 17.7% of GDP. Most importantly, in large part due to the problems of obesity, today's children may be the first generation ever to have a shorter life expectancy than that of their parents. This work presents the most current research and proposals for addressing the pandemic. Past studies have focused primarily on either genetic or behavioral causes of obesity; however, today's research indicates that a strongly integrated program is the best prospect for success in overcoming obesity. Furthermore, a focus on the role of society in establishing an affordable, accessible and sustainable program for implementing these lifestyle changes is vital, particularly for those in economically challenged situations, who are ultimately at the highest risk of obesity. Using studies from both neuroscience and behavioral science to present a comprehensive overview of the challenges and possible solutions, The Brain-to-Society Approach to Obesity Prevention focuses on what is needed to sustain a healthy, pleasurable and affordable lifestyle.
Abstract:
Over many centuries of settlement, Vietnamese inhabitants have developed a vernacular architecture that is well adapted to the region’s climatic and topographical conditions. Vernacular Vietnamese housing uses natural systems to create a built environment that integrates well with nature. The vernacular combines site-sensitive, passive solar design, natural materials and appropriate structure to achieve harmony among nature, humans and the built environment. Unfortunately, these unique features have not been applied in contemporary Vietnamese architecture, which relies on energy-intensive materials and built forms. This research analyses how environmentally responsive elements of vernacular architecture could be applied to modern sustainable housing in Vietnam. Elements of many types of vernacular architecture throughout the country are reviewed as precedents for future building planning and design. The paper also examines culturally and ecologically appropriate legislative and voluntary options for encouraging more sustainable housing.
Abstract:
In this thesis, I advance the understanding of information technology (IT) governance research and corporate governance research by considering the question “How do boards govern IT?” The importance of IT to business has increased over the last decade, but little academic research has focused on boards and their role in the governance of IT (Van Grembergen, De Haes and Guldentops, 2004). Most of the research on information technology governance (ITG) has focused on advancing the understanding and measurement of the components of the ITG model (Buckby, Best & Stewart, 2008; Wilkin & Chenhall, 2010), a model recommended by the IT Governance Institute (2003) as ‘best practice’ for boards to use in governing IT. IT governance is considered to be the responsibility of the board and is said to form an important subset of an organisation’s corporate governance processes (Borth & Bradley, 2008). Boards need to govern IT because of the large capital investment in IT resources and organisations’ high dependency on IT. Van Grembergen, De Haes and Guldentops (2004) and De Haes & Van Grembergen (2009) indicate that corporate governance matters cannot be discharged effectively unless IT is governed properly, and call for further specific research on the role of the board in ITG. Researchers also indicate that the link between corporate governance and IT governance has been neglected (Borth & Bradley, 2008; Musson & Jordan, 2005; Bhattacharjya & Chang, 2008). This thesis addresses this gap in the ITG literature by providing a bridge between the ITG and corporate governance literatures. My thesis uses a critical realist epistemology and a mixed-methods approach to gather insights into my research question. In the first phase of my research, I developed a survey instrument to assess whether boards consider the components of the ITG model in governing IT. The results of this first study indicated that directors do not conceptualise their role in governing IT using the elements of the ITG model. I therefore shifted focus to whether prominent corporate governance theories might elucidate how boards govern IT. In the second phase of the research, I used a qualitative, inductive case-based study to assess whether agency, stewardship and resource dependence theories explain how boards govern IT in Australian universities. As the first in-depth study of university IT governance processes, my research contributes to the ITG research field by revealing that Australian university board governance of IT is characterised by a combination of agency theory and stewardship theory behaviours and processes. The study also identified strong links between a university’s IT structure and evidence of agency and stewardship theories. This link provides insight into the structures element of the emerging enterprise governance of IT framework (Van Grembergen, De Haes & Guldentops, 2004; De Haes & Van Grembergen, 2009; Van Grembergen & De Haes, 2009b; Ko & Fink, 2010). My research makes an important contribution to governance research by identifying a key link between the corporate governance and ITG literatures and by providing insight into board IT governance processes. The research conducted in my thesis should encourage future researchers to continue exploring the links between corporate and IT governance research.
Abstract:
Many of the classification algorithms developed in the machine learning literature, including the support vector machine and boosting, can be viewed as minimum contrast methods that minimize a convex surrogate of the 0–1 loss function. The convexity makes these algorithms computationally efficient. The use of a surrogate, however, has statistical consequences that must be balanced against the computational virtues of convexity. To study these issues, we provide a general quantitative relationship between the risk as assessed using the 0–1 loss and the risk as assessed using any nonnegative surrogate loss function. We show that this relationship gives nontrivial upper bounds on excess risk under the weakest possible condition on the loss function—that it satisfies a pointwise form of Fisher consistency for classification. The relationship is based on a simple variational transformation of the loss function that is easy to compute in many applications. We also present a refined version of this result in the case of low noise, and show that in this case, strictly convex loss functions lead to faster rates of convergence of the risk than would be implied by standard uniform convergence arguments. Finally, we present applications of our results to the estimation of convergence rates in function classes that are scaled convex hulls of a finite-dimensional base class, with a variety of commonly used loss functions.
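A small numerical sketch of the paper's central construction may be useful. For a surrogate loss phi, the optimal conditional risk H(eta) and its restriction H-(eta) to wrong-sign predictions define the transform psi(theta) = H-((1+theta)/2) - H((1+theta)/2), which relates excess 0–1 risk to excess surrogate risk. The grid-search approximation and function names below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def conditional_risk(phi, eta, alphas):
    # C_eta(alpha) = eta * phi(alpha) + (1 - eta) * phi(-alpha)
    return eta * phi(alphas) + (1 - eta) * phi(-alphas)

def psi(phi, theta, alphas=np.linspace(-10, 10, 20001)):
    # psi(theta) = H^-((1 + theta) / 2) - H((1 + theta) / 2), via grid search
    eta = (1 + theta) / 2
    risks = conditional_risk(phi, eta, alphas)
    H = risks.min()                             # unconstrained infimum
    wrong_sign = alphas * (2 * eta - 1) <= 0    # scores that misclassify
    H_minus = risks[wrong_sign].min()           # infimum over wrong-sign scores
    return H_minus - H

hinge = lambda a: np.maximum(0.0, 1.0 - a)
exponential = lambda a: np.exp(-a)

for theta in (0.2, 0.5, 0.9):
    print(f"theta={theta}: hinge {psi(hinge, theta):.3f} (theory {theta:.3f}), "
          f"exp {psi(exponential, theta):.3f} "
          f"(theory {1 - np.sqrt(1 - theta ** 2):.3f})")
```

The closed forms used as checks, psi(theta) = |theta| for the hinge loss and 1 - sqrt(1 - theta^2) for the exponential loss, are standard consequences of this construction.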
Abstract:
Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive semidefinite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space: classical model selection problems in machine learning. In this paper we show how the kernel matrix can be learned from data via semidefinite programming (SDP) techniques. When applied to a kernel matrix associated with both training and test data, this gives a powerful transductive algorithm: using the labeled part of the data, one can learn an embedding also for the unlabeled part. The similarity between test points is inferred from training points and their labels. Importantly, these learning problems are convex, so we obtain a method for learning both the model class and the function without local minima. Furthermore, this approach leads directly to a convex method for learning the 2-norm soft margin parameter in support vector machines, solving an important open problem.
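As a rough illustration of one instance of this idea (not the authors' implementation), the sketch below uses the cvxpy modelling library to learn a kernel matrix restricted to the span of a few fixed base kernels, maximizing alignment with the label matrix y y^T subject to positive semidefiniteness and a trace constraint. The toy data, the choice of base kernels, and the trace bound are assumptions made for the example.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))            # toy inputs
y = np.sign(rng.normal(size=30))        # toy +/-1 labels
n = len(y)

def rbf(X, gamma):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def sym(A):
    return (A + A.T) / 2                # enforce exact numerical symmetry

base_kernels = [sym(X @ X.T), rbf(X, 0.1), rbf(X, 1.0)]

K = cp.Variable((n, n), PSD=True)       # learned kernel, constrained PSD
mu = cp.Variable(len(base_kernels))     # combination weights, possibly negative
constraints = [
    K == sum(mu[i] * base_kernels[i] for i in range(len(base_kernels))),
    cp.trace(K) == n,                   # fix the scale of K
]
objective = cp.Maximize(cp.sum(cp.multiply(K, np.outer(y, y))))  # alignment <K, y y^T>
cp.Problem(objective, constraints).solve()
print("learned kernel weights:", mu.value)
```

In the transductive setting of the abstract, K would be indexed over training and test points together, with the alignment objective evaluated on the labeled training block only.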
Abstract:
The support vector machine (SVM) has played an important role in bringing certain themes to the fore in computationally oriented statistics. However, it is important to place the SVM in context as but one member of a class of closely related algorithms for nonlinear classification. As we discuss, several of the “open problems” identified by the authors have in fact been the subject of a significant literature, a literature that may have been missed because it has been aimed not only at the SVM but at a broader family of algorithms. Keeping the broader class of algorithms in mind also helps to make clear that the SVM involves certain specific algorithmic choices, some of which have favorable consequences and others of which have unfavorable consequences—both in theory and in practice. The broader context helps to clarify the ties of the SVM to the surrounding statistical literature.
Abstract:
Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive semidefinite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space: classical model selection problems in machine learning. In this paper we show how the kernel matrix can be learned from data via semidefinite programming (SDP) techniques. When applied to a kernel matrix associated with both training and test data, this gives a powerful transductive algorithm: using the labelled part of the data one can learn an embedding also for the unlabelled part. The similarity between test points is inferred from training points and their labels. Importantly, these learning problems are convex, so we obtain a method for learning both the model class and the function without local minima. Furthermore, this approach leads directly to a convex method for learning the 2-norm soft margin parameter in support vector machines, solving another important open problem. Finally, the novel approach presented in the paper is supported by positive empirical results.
Abstract:
Mismanagement of large-scale, complex projects has resulted in spectacular failures, cost overruns, time blowouts, and stakeholder dissatisfaction. We focus discussion on the interaction of key management and leadership attributes that facilitate leaders’ adaptive behaviors. These behaviors should in turn influence adaptive team-member behavior, stakeholder engagement, and successful project outcomes, outputs and impacts. An understanding of this type of management will benefit from a perspective grounded in managerial and organizational cognition. The research question we explore is whether successful leaders of large-scale complex projects have an internal process leading to a display of administrative, adaptive, and enabling behaviors that foster adaptive processes and enabling behaviors within their teams and with external stakeholders. At the core of the model, we propose interactions among key attributes, namely cognitive flexibility, affect, and emotional intelligence. The result of these cognitive-affective interactions is leadership that enhances the likelihood of complex project success.