Past Coast to Coast Seminars - Abstracts


Coast to Coast Seminar Series: "Visual Analytics as a Cognitive Science"

School of Interactive Arts and Technology, Simon Fraser University

Date: Mar 26, 2013
Time: 11:30 - 12:30
Room: ASB10900

Abstract

This talk explores the larger implications of visual analytics, "the science of analytical reasoning facilitated by interactive visual interfaces", for cognitive science and informatics. I will argue that the methods that will advance this new science go beyond those of natural science and engineering, and will require researchers to create a new translational cognitive science of analytic systems. We will begin by building field study methods that characterize human and computational cognitive capabilities as they are used for decision-making in a range of situations. Because findings from field methods do not generalize well, we must then investigate these proposed capabilities in the laboratory. Finally, we must build mathematical and computational theories that predict the impact of changes in technology on cognitive processes in technology-rich environments. These methods will suffice only until processing capacity reduces the lag between an analyst's query and a graphical response below a critical level. When the response is generated at the same pace as the sequence of cognitive operations that the analyst performs, human and computational processes become "closely coupled". At this point the distinction between processes originating from the mind of the analyst (i.e. a mental representation) and those originating from the computer (i.e. a visualization) becomes impossible to determine, and the subsystems we study will seamlessly incorporate natural and artificial processes.

Coast to Coast Seminar Series: "Text Mining, Dreams and Elections"

Dalhousie University

Date: Mar 12, 2013
Time: 11:30 - 12:30
Room: ASB10901

Abstract

In this talk, we will review some of the recent applied text mining work at Dalhousie. We will argue the need for a text representation that is more linguistically informed than the standard vector model. We will present one such proposal, in which a co-occurrence model takes into account the distribution of words throughout the corpus. We will then show how this representation is successfully applied to the task of categorizing dream descriptions by their emotional valuation (joint work with J. De Koninck and A. Razavi, Ottawa). We will round off the talk with our experience with some of the other text mining techniques used in the analysis of Twitter traffic during the 2012 presidential elections in France and in the US (joint work with LIRMM, France).
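The co-occurrence idea can be sketched in miniature. The toy window-based counter below is only a hypothetical illustration of counting word co-occurrences, not the Dalhousie representation itself:

```python
from collections import defaultdict

def cooccurrence_counts(tokens, window=2):
    """Count how often each unordered word pair co-occurs within a sliding window."""
    counts = defaultdict(int)
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(i + 1 + window, len(tokens))):
            pair = tuple(sorted((w, tokens[j])))  # unordered pair as dict key
            counts[pair] += 1
    return dict(counts)

# Hypothetical miniature "corpus" of one dream description:
tokens = "the dream felt joyful and the dream felt strange".split()
counts = cooccurrence_counts(tokens, window=2)
```

A real model would normalize these counts over an entire corpus; the sketch only shows the windowed counting step.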

About the Speaker: Stan Matwin is a Professor and Canada Research Chair at Dalhousie University, and a Distinguished Professor at the University of Ottawa (on leave). He is a Fellow of ECCAI and CAIAC and an Ontario Champion of Innovation. Internationally recognized for his work in text mining and in applications of Machine Learning, he is a member of the Editorial Boards of leading journals in Machine Learning and Data Mining. Stan Matwin is one of the founders of Distil Interactive Inc. and Devera Logic Inc., and has significant experience and interest in innovation and technology transfer.

Coast to Coast Seminar Series: "Data Driven Design: New Perspectives on Visual Analytics"

Sara Diamond

Date: Feb 26, 2013
Time: 11:30 - 12:30
Room: ASB10900

Abstract

This talk underscores the importance of design methods and practices in approaching challenges in the representation of big data. The talk will first reference debates regarding the role and nature of aesthetics and the importance of these to perception and insight, providing illustrations of different aesthetic approaches, at times to the same data set. It will further the discussion of insight by considering ways to work with users and data sets that draw from different practices within design. Fundamentally, design and designers need to be part of the visual analytics equation.

Coast to Coast Seminar Series: "Towards Personal Visual Analytics"

Department of Computer Science, University of Calgary

Date: Jan 15, 2013
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Modern society demands that people manage, communicate, and interact with digital information at an ever-increasing pace. Even though most people want to be informed, all this information is frequently experienced as stress. It is not the information itself that is the problem, but the manner in which we are bombarded with information in forms that are often hard to interpret. How then can we produce interactive visualizations of digital data in a manner that enhances people's cognitive abilities? Ideally, these visualizations would not only present information visually and aesthetically, but provide people with capabilities for manipulating and exploring this information. A good visualization provokes interpretation, exploration and appreciation, inviting direct interaction that reveals the data.

This sets the stage for my over-arching research goal - to design, develop, and evaluate interactive visualizations so that they support the everyday practices of how people view, represent, manage, and interact with information. To this end, I have followed four intertwined themes: process, presentation, representation, and interaction. My research process convolves art, science, and design practices, and has become a topic of research in itself. Presentation is the act of displaying visuals, emphasizing and organizing areas of interest. Representation is development of accurate and revealing data-to-visual mappings. And interaction is the key to exploration and manipulation capabilities that can make information comprehension viable. In this talk, I will show how each theme is opening up to indicate exciting new directions and discuss how the currently shifting information climate is opening up new opportunities.

Sheelagh Carpendale is a Professor in Computer Science at the University of Calgary, where she holds a Canada Research Chair in Information Visualization and an NSERC/AITF/SMART Technologies Industrial Research Chair in Interactive Technologies. She leads the Innovations in Visualization (InnoVis) research group and has initiated the new interdisciplinary graduate-level specialization, Computational Media Design. Her research on information visualization, large interactive displays, and new media draws on her dual background in Computer Science (B.Sc. and Ph.D., Simon Fraser University) and Visual Arts (Sheridan College School of Design and Emily Carr College of Art). She has just been awarded an NSERC Steacie Memorial Fellowship in recognition of her outstanding research. She is an internationally renowned leader in both information visualization and multi-touch tabletop interaction, has recently served in such roles as Papers, Program, or Conference Chair for IEEE InfoVis and ACM Tabletop, and has received both the IEEE and ACM recognition of service awards.

Coast to Coast Seminar Series: "Contaminants, climate and the Arctic Ocean: What can we do in this sea of change?" Live from University of Victoria

Robie Macdonald

Date: Nov 27, 2012
Time: 11:30 - 12:30
Room: ASB10900

Abstract

During the past three decades the Arctic has been undergoing unprecedented change due to global warming. In addition to stress brought on by warming, the inhabitants of the Arctic are exposed to industrial and agricultural contaminants that have arrived there via air and water. I will examine some of the interactions between climate change/variability and contaminants mainly with the view of showing the need for understanding linkages between system components before proposing action. Interactions make it exceptionally difficult to isolate the consequences of these two sorts of stresses. Finally, I will look at the processes by which issue-oriented science gets funded and some of the problems we face as a society in using the resultant science to prioritize action through policy.

Coast to Coast Seminar Series: "Oil & Fish Tails: Cuts to Canada's environment and the changing face of Metro Vancouver's oil and gas industry" Live at SFU

Members of Parliament

Date: Nov 13, 2012
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Donnelly: Two of the strongest environmental laws in the country, the Fisheries Act and the Canadian Environmental Assessment Act, were significantly altered with the recent passing of the federal omnibus budget bill. Join us for a discussion about the ramifications for Canada's fishery and our natural environment.

Stewart: While every country needs a national science policy to further scientific advancement, there is an inherent tension between scientific practice and government science policy. Both the natural and social sciences require that researchers be free to ask questions and employ scientific tools such as peer review to discover answers. However, as science almost always requires public funding, governments will always attempt to steer scientific endeavour to meet political objectives. With this in mind, any national science policy needs to find the right balance between funding levels and the extent to which government controls the scientific agenda. This presentation outlines this tension more fully and explains the institutions and players found in the current Canadian science ecosystem. It then moves on to explain recent changes to Canada's approach to science, using current changes to the National Research Council as an example. The goal of this presentation is to generate discussion about the future direction of Canada's national science policy.

Coast to Coast Seminar Series: "The Role of Science in Marine Policy-Making" Live from Ottawa

Jake Rice

Date: Oct 30, 2012
Time: 11:30 - 12:30
Room: ASB10900

Abstract

The talk will commence by setting a background for science-based marine policy-making. The first background component is a review of the standard principles (platitudes?) about the role of science in policy-making generically. The review will consider both why each principle is considered important and the implications of each one for the dynamics of science advisory processes and for interactions between science advisors and policy makers. The other component of the context for this talk will be a review of the meaning of sustainable use/development in natural resource management and policy. These two components of the context for the science-policy interface will be brought together with a brief discussion of what "integration" means in policy-making and in the science advice which supports it.

The talk will move on to the special challenges of policy-making in marine environments, considering both areas within national jurisdictions and areas beyond national jurisdiction. It will develop the thesis that drivers of marine policy-making have a strong top-down nature. Broad commitments regarding conservation and sustainable use and development are made in very high-level fora such as Rio (1992), Johannesburg (2002), and Rio+20 (2012). These become translated into paragraphs of greater specificity in a pair of annual UN General Assembly Resolutions on Sustainable Fisheries and on Oceans and the Law of the Sea. Once commitments are adopted at the UN level, the major UN intergovernmental agencies, such as the Food and Agriculture Organization and the Convention on Biological Diversity, take over developing implementation frameworks for use by Parties and sectors. These frameworks are finally taken up at the national level within national jurisdictions, and in complex governance arrangements in areas beyond national jurisdiction, and implemented with policies, regulations, and occasionally even legislation.

At every stage in this top-down process science advice is needed. However, the nature of the science advice changes at each stage, as do the dynamics of the science-policy interface. The fact that, below the very highest level, the conversion of broad commitments into specific policies and practices occurs in two parallel streams (a fisheries sectoral governance stream and a biodiversity conservation governance stream) poses more challenges than mere duplication of effort. Many of the warts hiding in the principles and platitudes of how the science-policy interface works, and in the nature of science advice itself, are revealed in how these two streams play out in parallel, each striving to implement common commitments, but each with different histories and different features. The talk will illustrate those "challenges" in the science-policy interface with specific examples such as "ecologically and biologically significant areas" and "vulnerable marine ecosystems". The wrap-up of the talk will consider whether the marine science-policy process is just a flawed divergence from the idealized science-policy interface, or whether its imperfections are in fact ways that the real world differs from an abstract and imaginary ideal.

Coast to Coast Seminar Series: "Disease, Aquaculture, and Pacific Salmon Management" Live from SFU

Department of Statistics and Actuarial Science, Simon Fraser University

Date: Oct 16, 2012
Time: 11:30 - 12:30
Room: ASB10900

Abstract

The speaker will present his perspective on the tumultuous events that followed his receipt last fall of a report of positive test results for the infectious salmon anaemia virus (ISAv) in Rivers Inlet sockeye salmon. The account will include the immediate reaction, subsequent revelations, and results from follow-up sampling. The story will highlight instances of what the speaker views as a serious disregard for the role of scientific evidence in the development of governmental policy - and consequent, unacceptable risks to wild Pacific salmon. The presentation will conclude with a proposal for organizational reform aimed at promoting the role of scientific research in the protection of wild Pacific salmon.

Coast to Coast Seminar Series: "Climate Delusions" Live from SFU

School of Resource and Environmental Management, Simon Fraser University

Date: Oct 02, 2012
Time: 11:30 - 12:30
Room: ASB10900

Abstract

From the research of natural scientists, Al Gore talks of an inconvenient truth: that humans are heating the planet, especially by burning fossil fuels to emit carbon pollution. But from the research of social scientists, we know of a second inconvenient truth: that humans are prone to delude themselves and others about real-world evidence because of self-interest, convenience and preference, and this is preventing us from taking effective action to minimize climate change.

The effort to stop global warming is frustrated by an array of delusions in which evidence is ignored or fabricated. Even worse, it is not just climate science skeptics who do this. Even people who want action on climate change ignore evidence and sustain delusions that prevent effective action. This includes people who believe that peak oil is imminent, that energy efficiency is cheap and easy, that behavioral change is necessary and effective, that renewables can soon outcompete fossil fuels, that carbon offsets lead to carbon neutrality, and that a global agreement can be reached by voluntary consensus.

This talk explains why leading social science researchers know that these are delusions, and more importantly what to do in order to act in time to prevent massive species extinctions and major human costs.

Coast to Coast Seminar Series: "Global change impacts on biodiversity: the view from Canada"

Conservation Biology and Macroecology, University of Ottawa

Date: Sep 18, 2012
Time: 11:30 - 12:30
Room: ASB10901

Abstract

The combined effects of habitat loss and proliferation of introduced species present serious conservation challenges. These aspects of global change have created a black hole for species in Canada and globally, pulling many toward extinction. Human activities have added climate change to this dangerous mix. Recent research improves capacity to predict species impacts of such effects.

Species losses can erode the robust provision of economically and ecologically indispensable ecosystem services, like pollination. In the past 25 years, several wild pollinator species have nearly totally collapsed in North America. Although habitat loss, introduced diseases, and pesticide use have not helped, we present new evidence that climate change alone could explain some bumblebee losses.

Massive increases in weather extremes can precipitate species collapses, even among widespread, abundant insect pollinators. These effects, known from the paleoecological record, have not previously been linked to a modern extinction.

Further losses of species and ecosystem service degradation are not inevitable. Informed by concerted scientific action and an involved public, elected leaders sometimes take landmark steps to conserve wilderness areas and strengthen legal frameworks protecting species at risk.

Coast to Coast Seminar Series: "Planning and Control of Massive Networks"

Ausgrid Chair of Electrical Engineering, University of Sydney

Date: Apr 03, 2012
Time: 16:00 - 17:00
Room: ASB10901

Abstract

The modernization of infrastructure networks requires coordinated planning and control. Considering traffic networks and electricity grids raises similar issues about how to achieve substantial new capabilities of effectiveness and efficiency. For instance, power grids need to integrate renewable energy sources and electric vehicles. It is clear that all this can only be achieved by greater reliance on systematic planning in the presence of uncertainty, and on sensing, communications, computing and control on an unprecedented scale, these days captured in the term 'smart grids'. This talk will outline current research on planning future grids and control of smart grids. In particular, the possible roles of network science will be emphasized, along with the challenges that arise.

Coast to Coast Seminar Series: "How to Build a Brain: From Single Cells to Cognitive Systems"

Department of Philosophy, Department of Systems Design Engineering, Canada Research Chair in Theoretical Neuroscience, Director, Centre for Theoretical Neuroscience, University of Waterloo

Date: Mar 20, 2012
Time: 11:30 - 12:30
Room: ASB10901

Abstract

How do billions of single neurons result in the complex behaviors we observe in animals and in ourselves? In this talk, I discuss my lab's approach to answering this question. In short, we build large-scale simulations at the level of single cells, which exhibit a wide range of flexible, dynamic, and cognitive behaviors. I discuss why the principles we employ are reasonable, and describe the benefits, successes, and challenges of this research.
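As a hedged illustration of what simulation "at the level of single cells" can mean, here is a minimal leaky integrate-and-fire neuron, one common single-cell building block for large-scale simulations. All parameters are illustrative, not those used in the speaker's lab:

```python
def lif_spike_times(i_ext, dt=0.1, t_max=100.0, tau=10.0,
                    v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: tau * dV/dt = -V + i_ext.

    Returns the spike times produced by a constant input current i_ext
    over t_max milliseconds, using simple Euler integration.
    """
    v, t, spikes = 0.0, 0.0, []
    while t < t_max:
        v += (dt / tau) * (-v + i_ext)   # Euler step of the membrane equation
        if v >= v_thresh:                # threshold crossing: emit a spike
            spikes.append(t)
            v = v_reset                  # reset the membrane after spiking
        t += dt
    return spikes

spikes = lif_spike_times(i_ext=2.0)   # suprathreshold drive: regular firing
```

With a subthreshold input (here anything below 1.0), the membrane potential saturates below threshold and the model stays silent, which is the basic nonlinearity that networks of such cells build on.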

Coast to Coast Seminar Series: "Five-Colour Theorem and Beyond"

Canada Research Chair in Graph Theory, Department of Mathematics, Simon Fraser University

Date: Mar 06, 2012
Time: 11:30 - 12:30
Room: ASB10900

Abstract

In 1994, Carsten Thomassen published a beautifully simple proof confirming that every planar graph is 5-list-colourable. Another beautiful proof on a similar topic was given a few years later by Mike Albertson, who proved that every precolouring of a set of vertices in a planar graph that are far apart from each other can be extended to a 5-colouring of the whole graph. After presenting these enlightening contributions, the speaker will discuss possible common generalizations of these results and report on some recent progress.

Coast to Coast Seminar Series: "Aging in Individuals and Populations/Mathematical Modeling"

Department of Medicine, Department of Community Health and Epidemiology, Department of Mathematics and Statistics, Dalhousie University

Date: Feb 21, 2012
Time: 11:30 - 12:30
Room: ASB10901

Abstract

Aging, its complexity, and its regularities: the Gompertz law of mortality. Chronological vs. biological aging. Aging as a process of deficit accumulation. The frailty index as a proxy measure of individual and population aging. The concept of equality of health deficits. Phenomenological invariants of aging: aging rates; sex-related differences; the limit in deficit accumulation; compensation laws of mortality and deficit accumulation. Stochastic dynamics of age trajectories. Irreversibility of chronological aging and local reversibility of biological aging. What is the law that governs changes in health during aging? Aging, health and wealth, and how they are related worldwide.
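Two of the quantities above can be sketched concretely. This is a minimal illustration with hypothetical numbers and illustrative parameters, not the speaker's actual model:

```python
import math

def frailty_index(deficits):
    """Frailty index: the fraction of measured health deficits present.

    Each entry is coded 0 (absent), 1 (present), or a graded value between.
    """
    return sum(deficits) / len(deficits)

def gompertz_hazard(age, a=1e-4, b=0.085):
    """Gompertz law of mortality: the hazard rises exponentially with age.

    The parameters a and b here are illustrative, not fitted values.
    """
    return a * math.exp(b * age)

# A hypothetical individual assessed on ten deficits (one graded at 0.5):
fi = frailty_index([1, 0, 0, 1, 0.5, 0, 0, 1, 0, 0])
```

The frailty index is deliberately simple (a proportion), which is what makes it usable as a proxy measure across very different deficit lists.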

Coast to Coast Seminar Series: "Spectral Analysis and Dynamical Behavior of Complex Networks"

School of Engineering Science, Simon Fraser University

Date: Feb 07, 2012
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Discovering properties of the Internet topology is important for evaluating the performance of various network protocols and applications. The discovery of power-laws and the application of spectral analysis to Internet topology data indicate a complex behavior of the underlying network infrastructure that carries a variety of Internet applications. In this talk, we present an analysis of datasets collected from the Route Views and RIPE projects. The analysis of the collected data shows certain historical trends in the development of the Internet topology. While the values of various power-law exponents have not substantially changed over recent years, spectral analysis of matrices associated with Internet graphs reveals notable changes in the clustering of Autonomous Systems and their connectivity.
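As a minimal sketch of the kind of spectral computation involved, one can examine the eigenvalues of the normalized Laplacian of a graph; a four-node toy graph stands in here for a real Autonomous System topology:

```python
import numpy as np

def normalized_laplacian(adj):
    """L = I - D^{-1/2} A D^{-1/2} for an undirected graph with adjacency A.

    Its eigenvalues lie in [0, 2] and reflect clustering and connectivity.
    """
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    return np.eye(len(adj)) - d_inv_sqrt @ adj @ d_inv_sqrt

# Toy graph: a path on four nodes (AS-level graphs have tens of thousands).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
eigvals = np.sort(np.linalg.eigvalsh(normalized_laplacian(A)))
```

The smallest eigenvalue of a connected graph is 0, and the shape of the rest of the spectrum is what changes as clustering among nodes changes.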

Coast to Coast Seminar Series: "Modelling healthcare policy decisions for planning and impact analysis"

Centre for Research in Healthcare Engineering, Mechanical and Industrial Engineering, University of Toronto

Date: Nov 29, 2011
Time: 11:30 - 12:30
Room: ASB10901

Abstract

Michael Carter is a Professor in the Department of Mechanical and Industrial Engineering at the University of Toronto and Director of the Centre for Research in Healthcare Engineering. He received his doctorate in Mathematics from the University of Waterloo in 1980. Since 1989, his research focus has been in the area of health care resource modeling with a variety of projects in hospitals, home care, rehab, long term care, medical labs and mental health institutions. He has supervised more than 160 engineering students in over 100 projects with healthcare institutions. He currently has 18 students (7 doctoral, 5 masters and 6 undergrad) working in the area. He was the winner of the Annual Practice Prize from the Canadian Operational Research Society (CORS) four times (1988, 1992, 1996 and 2009). In 2000, he received the CORS Award of Merit for lifetime contributions to Canadian Operational Research. He also received an Excellence in Teaching Award from the University of Toronto Student Administrative Council. He is on the editorial board for the Journal of Scheduling and the journal Health Care Management Science. Professor Carter is co-editor of an issue of Interfaces on Healthcare Applications. He is a member of the Nursing Effectiveness, Utilization and Outcomes Research Unit and a mentor in the Health Care, Technology and Place Program at the University of Toronto. He was a lecturer with the Project H.O.P.E. international program in Healthcare Quality in Central and Eastern Europe in 2002 (Estonia & Latvia) and 2003 (Hungary & the Czech Republic). He is on the Advisory Board for the Regenstreif Centre for Healthcare Engineering at Purdue University. He is an Adjunct Scientist with the Institute for Clinical Evaluative Sciences in Toronto (www.ices.on.ca).

Coast to Coast Seminar Series: "The effects of confinement and landscape fragmentation on predator-prey dynamics"

Department of Physics, University of Prince Edward Island

Date: Nov 15, 2011
Time: 11:30 - 12:30
Room: ASB10901

Abstract

A better comprehension of animal movement is vital to interpreting key ecological and evolutionary processes such as the spatio-temporal patterns of resource selection, foraging behaviour, and predator-prey interactions. As human activities continually alter landscapes and influence the behaviour and movement patterns of organisms, a variety of pressing ecological and health issues are emerging, such as the spread of invasive species and infectious diseases. Hence, advances in our understanding of animal movement will have direct implications in several disciplines including landscape ecology, conservation biology, and wildlife management, as well as those dealing with public health. In this talk, I will discuss our recent studies on the effects of confinement and landscape fragmentation on predator-prey dynamics through the use of a robust individual-based movement model (IBMM). The relative foraging efficiency of different predator (and prey) search models is examined, including area-restricted search, the Lévy walk, and a composite correlated random walk (CCRW) model, under different confinement and fragmentation conditions. In addition, a number of movement metrics are calculated, including the move-length distribution, the net squared displacement, the radius of gyration, and the turning-angle correlation function, to examine the effects of confinement on scaling behaviour. The simulation results will be compared with recent field studies that we have conducted on the red fox (Prince Edward Island) and the wild dog (South Africa).
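The correlated-random-walk idea can be sketched in miniature. The model below is a deliberately crude stand-in for the IBMM described above, with illustrative parameters throughout, and it also computes one of the movement metrics mentioned (net squared displacement):

```python
import math
import random

def correlated_random_walk(n_steps, step_len=1.0, turn_sd=0.3, box=20.0, seed=1):
    """2D correlated random walk confined to [0, box]^2 by wall reflection.

    Each step keeps roughly the previous heading, perturbed by a Gaussian
    turning angle, which is what makes the walk "correlated".
    """
    rng = random.Random(seed)
    x, y, heading = box / 2, box / 2, 0.0
    path = [(x, y)]
    for _ in range(n_steps):
        heading += rng.gauss(0.0, turn_sd)
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        if x < 0.0: x = -x               # reflect at the walls (confinement)
        if x > box: x = 2.0 * box - x
        if y < 0.0: y = -y
        if y > box: y = 2.0 * box - y
        path.append((x, y))
    return path

def net_squared_displacement(path):
    """Squared start-to-end distance, a standard movement metric."""
    (x0, y0), (xn, yn) = path[0], path[-1]
    return (xn - x0) ** 2 + (yn - y0) ** 2

path = correlated_random_walk(500)
nsd = net_squared_displacement(path)
```

In an unconfined walk the net squared displacement grows with time; confinement caps it at the box scale, which is the qualitative effect the abstract's metrics are designed to detect.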

Coast to Coast Seminar Series: "Optimization models and methods for radiation therapy treatment planning"

Department of Mathematics and Statistics, University of Calgary

Date: Nov 01, 2011
Time: 11:30 - 12:30
Room: ASB10901

Abstract

External radiation is a primary modality in treating various cancers. The clinical outcome of the treatment relates directly to the radiation dose delivered to the patient. A fundamental question in treatment planning, "can we produce better plans relying on the existing technology?", still remains unanswered, in large part due to the underlying complexity of the problem. We overview some optimization techniques that have the potential to improve the situation.

Coast to Coast Seminar Series: "Exploiting sub-structure in non-smooth optimization problems"

Department of Mathematics and Statistics, University of British Columbia, Okanagan Campus

Date: Oct 18, 2011
Time: 11:30 - 12:30
Room: ASB10901

Abstract

Mathematical optimization, the study of how to locate maximizers and minimizers of a function, arises naturally in almost every scientific research field. Applications can be found in everything from microchips to forest roads. In many applications, the underlying optimization problem is non-differentiable, discontinuous, or worse. This has led to new collections of very robust and powerful algorithms that work on a huge variety of problems. However, such approaches are often too slow for practical usage. In this talk we discuss examples of how a close examination of an optimization problem can often reveal substructures that can be used to help understand and solve the problem.

Coast to Coast Seminar Series: "Modelling of cell movement in tissue and application to glioma growth"

Department of Mathematical and Statistical Sciences, University of Alberta

Date: Oct 04, 2011
Time: 11:30 - 12:30
Room: ASB10901

Abstract

In this talk I will study mathematical models for the movement of cells in aligned tissue. These cells are typically cancer metastases, which invade along fibre tracts into healthy tissue. A new MRI modality called diffusion tensor imaging (DTI) can be used to measure the fibrous structure inside the brain (e.g. white matter tracts). I will discuss how transport equations and non-isotropic diffusion equations can be used to incorporate DTI data into the modeling of glioma growth.
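A minimal numerical sketch of non-isotropic diffusion, assuming a constant diagonal diffusion tensor on a periodic grid; the actual models use a spatially varying tensor derived from DTI data:

```python
import numpy as np

def anisotropic_diffusion_step(u, dxx, dyy, dt=0.1):
    """One explicit Euler step of u_t = dxx * u_xx + dyy * u_yy on a
    periodic unit-spaced grid. A constant diagonal tensor is the simplest
    anisotropic case; DTI would supply a full position-dependent tensor.
    """
    uxx = np.roll(u, -1, axis=0) - 2.0 * u + np.roll(u, 1, axis=0)
    uyy = np.roll(u, -1, axis=1) - 2.0 * u + np.roll(u, 1, axis=1)
    return u + dt * (dxx * uxx + dyy * uyy)

u = np.zeros((32, 32))
u[16, 16] = 1.0                      # point seed for the cell density
for _ in range(50):                  # dt = 0.1 is stable for these coefficients
    u = anisotropic_diffusion_step(u, dxx=1.0, dyy=0.2)
```

Because dxx > dyy, the seeded density spreads faster along the first axis than the second, which is the qualitative signature of invasion along aligned fibres; total mass is conserved on the periodic grid.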

Coast to Coast Seminar Series: "The importance of spatial analysis when modeling social systems"

School of Criminology, Simon Fraser University

Date: Sep 20, 2011
Time: 11:30 - 12:30
Room: ASB10900

Abstract

When modeling complex social systems many dimensions need to be considered. In this presentation, I argue that space, or geography, is one of those necessary dimensions. Beginning with a brief overview of "ecological" investigations that use spatially-referenced data but do not always map the data, I show the utility of mapping this information in a number of ways. First, I show how visualizing data allows for a clearer picture of your research area to emerge. Second, this is followed by the use of exploratory spatial data analysis and identifying the strength of spatial relationships. Third, I provide a brief overview of classical versus spatial regression techniques. Fourth, I discuss the importance of local spatial statistics, both as an exploratory tool and in an inferential context. Lastly, I discuss the importance of scale in any spatial analysis using a recently developed spatial point pattern test.
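The strength of spatial relationships mentioned in the second point is commonly measured with global Moran's I. Here is a minimal sketch with hypothetical toy data and rook-contiguity weights on four areas in a row:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: positive when neighbouring areas hold similar
    values, negative when neighbours are dissimilar, near an expected
    value of -1/(n-1) under spatial randomness.
    """
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()                       # deviations from the mean
    return (len(x) / w.sum()) * (z @ w @ z) / (z @ z)

# Hypothetical toy data: four areas in a row, 1 if adjacent, else 0.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
clustered = [1.0, 1.0, 5.0, 5.0]     # like values adjoin
alternating = [1.0, 5.0, 1.0, 5.0]   # unlike values adjoin
```

The clustered pattern yields a positive statistic and the alternating pattern a negative one, which is exactly the contrast between spatial clustering and spatial dispersion that exploratory spatial data analysis looks for.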

Coast to Coast Seminar Series: "A Framework for Modeling Network Risk"

Department of Economics, SFU

Date: Mar 29, 2011
Time: 11:30 - 12:30
Room: ASB10900

Abstract

The complexity of the financial system is increasing at an accelerating pace. This evolution is driven not only by technology but also by business practices. For example, as risk managers reach out for different exposures in the pursuit of diversification, the financial network becomes more tangled. The ensuing uncertainty has been indicated as one of the causes that aggravated the global financial crisis.

In this talk, we will cover some recent results on the econometrics of networks and describe an empirical framework for quantifying network risk. These tools can be used to analyze liquidity in a decentralized market, to understand the trade-off between diversification gains and increased network complexity, and to evaluate policies that can mitigate the uncertainty faced by market participants about the network structure.

Coast to Coast Seminar Series: "Bankers, Bonuses and Busts"

Cheriton School of Computer Science, University of Waterloo

Date: Mar 15, 2011
Time: 11:30 - 12:30
Room: ASB10908

Abstract

It is commonly believed that the current problems in the capital markets are a result of the financial models developed by academic mathematicians and industry practitioners. In fact, many of us have been pointing out for years that banks were involved in very risky activities, which produced illusory short-term profits.

In this talk, I will give a short introduction to the theory behind pricing and hedging derivative contracts (options). A derivative contract is based on an underlying asset. The standard model for the underlying asset price movement assumes that prices evolve according to a random walk with a drift. It is possible for an option seller to set up a hedging portfolio, which is then dynamically rebalanced in response to changes in the underlying asset price. Then, regardless of the random movement of the asset price, the seller of the option is able to pay out the value of this contract at expiry.
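The standard pricing model described here can be sketched with a short Monte Carlo example under geometric Brownian motion, the usual formalization of a random walk with drift. The parameters are illustrative, and this is the plain model without the jumps discussed below:

```python
import math
import random

def bs_call(s, k, r, sigma, t):
    """Closed-form Black-Scholes price of a European call option."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return s * cdf(d1) - k * math.exp(-r * t) * cdf(d2)

def mc_call(s, k, r, sigma, t, n=100_000, seed=0):
    """Monte Carlo price of the same call: simulate terminal asset prices
    under geometric Brownian motion and discount the average payoff."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        st = s * math.exp((r - 0.5 * sigma ** 2) * t + sigma * math.sqrt(t) * z)
        total += max(st - k, 0.0)
    return math.exp(-r * t) * total / n
```

With s = k = 100, r = 5%, sigma = 20%, and t = 1 year, the two routes agree to within Monte Carlo error (both near 10.45). Under a jump-diffusion model the simulated terminal prices would instead include occasional large moves, and the hedging argument becomes much more delicate.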

There is strong evidence that the normal market behavior assumed by standard models is punctuated by occasional large jumps or drops in prices (e.g. subprime mortgages). These processes are called "jump diffusion" models.

A simulation of a trading strategy which exploits these market characteristics shows that bankers can produce apparent profits for many years, followed by enormous losses. The compensation system widely used in the financial sector encourages these sorts of activities.

In essence, this bonus system allows executives and traders to be rewarded for apparent short term profits, and to walk away unscathed after producing staggering losses.

Coast to Coast Seminar Series: "Analysis of Contingent Capital Bonds in Merton-type Structural Models"

Department of Applied Mathematics, University of Western Ontario

Date: Mar 01, 2011
Time: 11:30 - 12:30
Room: ASB10901

Abstract

Contingent capital bonds (CCB) are securities which "begin life" as subordinated debt, and convert to equity if the issuing firm becomes financially distressed. CCB have recently begun to attract attention as a means to shift the cost of supporting distressed financial institutions from taxpayers to shareholders, thereby enforcing "market-based" discipline. At the present time, however, the discussion surrounding CCB has been largely heuristic. In this talk we attempt, via Merton-type structural models, to shed theoretical light on two fundamental issues regarding CCB. The first is their cost (i.e. par yield), which we find to be surprisingly cheap. Indeed, overall debt costs are reduced when CCB are introduced into the capital structure. The second issue we investigate is how CCB respond to changes in fundamental parameters such as leverage and volatility. We find that the answer to this question depends critically on the conversion price, and different prices (for example, fixed versus market-based) lead to radically different behaviour.

Joint work with R. Mark Reesor, Applied Mathematics, University of Western Ontario

Coast To Coast: "Global Financial Crisis: A Corporate Governance Problem"

Vijay Vishwakarma

Date: Feb 15, 2011
Time: 11:30 - 12:30
Room: ASB10901

Abstract

The U.S. sub-prime mortgage crisis of 2007, which turned into the global financial crisis of 2008, has brought corporate governance practices back under the spotlight. The global financial crisis was a mix of macro-economic policy, financial regulation, politics and corporate governance weaknesses. Various studies point to governance problems in banks as the key factor (Kashyap, 2010). Out of the many corporate governance variables (board, compensation, shareholder rights, risk management, etc.), researchers have identified board issues as warranting immediate attention (Kirkpatrick, 2008). The causes and symptoms of the global financial crisis are very similar to those of the Asian financial crisis of 1997, where many researchers blamed weak corporate governance as one of the causes (Stiglitz, 1998; Harvey and Roper, 1999; Greenspan, 1999). Canada, which has weathered the global financial crisis much better than any other industrialized economy, can serve as a real-time lesson for the rest of the world: prudence and good regulation work in the long run.

Coast to Coast Seminar Series: "Central Banks Shore up their Arsenals"

Department of Economics, Dalhousie University

Date: Feb 01, 2011
Time: 11:30 - 12:30
Room: ASB10908

Abstract

Effective monetary policy in hard economic times depends on the ability of the central bank to influence consumer behaviour through adjustments to interest rates and exchange rates. When the Bank of Canada dropped its overnight rate to one quarter of a percent in April 2009, observers began to worry that the bank was out of bullets. In an extraordinary move, the bank issued a conditional commitment: it would keep the rate at that low level for at least a year. This new approach had the stimulus effect the bank had hoped for. Central banks in other countries, particularly in the US and the UK, have not been so lucky. The financial crisis has inspired macroeconomists to rethink how we conduct monetary policy at the effective lower bound of interest rates. Central banks have had to shore up their arsenals, if for no other reason than to instill some public confidence in their ability to influence the economy.

About the speaker: Marina Adshade is an assistant professor in the Department of Economics at Dalhousie University. She has a Ph.D. in economics from Queen's University and came to Dalhousie after completing postdoctoral research with the Team for Advanced Research in Globalization, Education and Technology (TARGET) at the University of British Columbia. Her areas of specialization are macroeconomics and economic history, with a particular interest in female labour markets. She is a regular contributor to the Globe and Mail's Economy Lab, a columnist with New York Magazine, and writes a blog for the website Big Think (http://bigthink.com/blogs/dollars-and-sex).

Coast to Coast Seminar Series: "Climate Impacts of Freshwater Forcing of the Ocean General Circulation"

Centre For Global Change Science, University of Toronto

Date: Nov 30, 2010
Time: 11:30 - 12:30
Room: ASB10901

Abstract

During the past million years of Earth history, climate variability has been dominated by a 100 kyr cycle of continental scale glaciation and deglaciation. Each of these quasi-periodic events owed its existence to the minute variations in the distribution of solar radiation caused by gravitational n-body effects in the solar system. In each cycle of this process, continental glaciation was accompanied by a fall of mean sea level of approximately 120 m. The glaciation phase of each cycle persisted for approximately 90,000 years, whereas the deglaciation phase was much more rapid, lasting approximately 10,000 years. During deglaciation, the return of freshwater to the ocean basins was responsible for highly significant disruptions of climate, foremost among which was the so-called "Younger Dryas" climate reversal, during which northern hemisphere surface temperatures were forced to return to near full-glacial cold conditions even as the system was in the process of returning to a state of modern warmth. This phenomenon provides a target for testing the transient response of the global climate models that are employed to make predictions of the influence of global warming due to increasing concentrations of atmospheric greenhouse gases. This test will be described in detail.

Coast to Coast Seminar Series: "How will Marine Ecosystems Adapt to a Future Ocean that will be Warmer, More Stratified, More Acidic and Less Oxygenated?"

DFO Institute of Ocean Sciences, and EC Canadian Centre for Climate Modelling and Analysis, University of Victoria

Date: Nov 16, 2010
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Primarily from burning fossil fuels, humans are adding increasing amounts of the greenhouse gas carbon dioxide to the atmosphere. More than a third of this new carbon dioxide ends up in the ocean, and more than 90% of the additional heat from the greenhouse effect is entering the oceans. As a result the oceans are becoming warmer and more stratified, which reduces the mixing of nutrients from below up into the surface ocean and of oxygen from the surface layer down into the subsurface ocean. In addition, the extra carbon dioxide is causing the oceans to become more acidic. Can we predict how whole marine ecosystems will adapt, when we do not yet know how much capacity individual species have to adapt to these expected changes to their environment? I will outline a modelling framework to explore the capacity of species to adapt to a changing environment based on existing 'phenotypic' diversity and potential 'plasticity'.

Bio

Ken Denman is a Senior Scientist with Fisheries and Oceans Canada (DFO), since 2000 working at the Canadian Centre for Climate Modelling and Analysis of Environment Canada, located at the University of Victoria where he is an Adjunct Professor. His research involves the interactions between marine ecosystems, biogeochemical cycles and climate change. His current research interests centre on forecasting the responses of marine ecosystems to the acidification of the oceans and to other aspects of climate change including possible geoengineering measures. He was Coordinating Lead Author of Chapter 7 of the 2007 Intergovernmental Panel on Climate Change (IPCC) WG1 AR4 titled "Couplings between changes in the climate system and biogeochemistry"; and Coordinating Lead Author of Chapter 10 in the Second Assessment Report (1995) of IPCC WG1, titled "Marine biotic responses to environmental change and feedbacks to climate". The IPCC shared the 2007 Nobel Peace Prize with Al Gore for its work on climate change. Ken Denman is a Fellow of the Royal Society of Canada, and has received the President’s Prize of the Canadian Meteorological and Oceanographic Society, the T.R. Parsons Medal for excellence in ocean science, and the Wooster Award of the North Pacific Marine Sciences Organization (PICES) for research excellence in the North Pacific. He has served on the Steering Committees of the Joint Global Ocean Fluxes Study (JGOFS), the Global Ocean Observing System (GOOS), and the Surface Ocean Lower Atmosphere Study (SOLAS). He recently completed 6 years as a member of the Joint Scientific Committee of the World Climate Research Programme. He received a PhD in ocean physics from the University of British Columbia.

Coast to Coast Seminar Series: "A Sea of Change"

Department of Oceanography, Dalhousie University

Date: Nov 02, 2010
Time: 11:30 - 12:30
Room: ASB10901

Abstract

The world's oceans are undergoing alterations not seen in hundreds of thousands of years. The surface ocean is warmer and more acidic, and deeper reaches are increasingly deoxygenated. New evidence now indicates that the oceans have lost a significant portion of their phytoplankton - the base of the marine food chain - over the last 50 years, and appear to support only 10% of the large predators active prior to the industrial age. These are not isolated changes, and they appear to have some - albeit complex - common causes. This talk will focus on the scientific evidence to date, with a view towards what the future might bring.

Coast to Coast Seminar Series: "Anthropogenic Influence on Long Return Period Daily Temperature Extremes at Regional Scales"

Director, Pacific Climate Impacts Consortium

Date: Oct 19, 2010
Time: 11:30 - 12:30
Room: ASB10900

Abstract

There is now a well established approach to detecting and attributing the causes of observed changes in mean climatic conditions that has been applied progressively from global to regional scales, to temperature and other climate variables. While this research has provided a great deal of useful information about the causes of climate change observed during the past century or more, policy makers and others have also been demanding answers about whether there are attributable changes in the frequency and/or intensity of extreme weather and climate events. The statistical techniques required to respond to these questions are only now being developed. This talk will describe a standard technique that is used in climate change detection and attribution research, propose a parallel approach that might be used to assess whether there is a detectable human influence in the far tails of the distribution of a climate variable such as daily maximum air temperature, demonstrate an initial application of the approach, and discuss limitations and areas for further improvement. Using the proposed approach, we show that an anthropogenic influence is detectable globally, and in many regions, in the extremes of daily maximum and minimum temperatures. Globally, waiting times for extreme annual minimum daily minimum and daily maximum temperature events that were expected to recur once every 20 years in the 1960s are now estimated to exceed 35 and 30 years respectively. In contrast, waiting times for circa-1960s 20-year extremes of annual maximum daily minimum and daily maximum temperatures are estimated to have decreased to less than 10 and 15 years respectively.

Coast to Coast Seminar Series: "Insight into Lower Atmospheric Composition from Remote Sensing and Modeling"

Department of Physics and Atmospheric Science, Dalhousie University

Date: Oct 05, 2010
Time: 11:30 - 12:30
Room: ASB10901

Abstract

Satellite remote sensing of atmospheric composition (such as aerosols, ozone, and their precursors) has progressed markedly over the past decade. Global numerical modeling plays a critical role in interpreting these observations. This talk will highlight recent advances in both remote sensing and global modeling of the troposphere, and their application for insight into processes affecting climate and global air quality.

About the speaker: Randall Martin is a Killam Professor in the Department of Physics and Atmospheric Science at Dalhousie University, and a Research Associate at the Harvard-Smithsonian Center for Astrophysics. He received a B.S. in Engineering from Cornell University in 1996, an M.Sc. in Environmental Science from Oxford University in 1998, and a Ph.D. from Harvard University in 2002 with a focus on Atmospheric Chemistry. He is a recipient of the Langstroth Memorial Teaching Award, an NSERC Discovery Accelerator Supplement, and a Killam Prize. He has published 70 peer-reviewed journal articles on the processes that affect atmospheric composition, and their implications for climate and air quality.

Coast to Coast Seminar Series: "Applying Machine Learning Methods to Climate Variability"

Department of Earth and Ocean Sciences, University of British Columbia

Date: Sep 21, 2010
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Machine learning methods, which originated in computational intelligence (i.e. artificial intelligence), are now ubiquitous in the environmental sciences. Applications of machine learning methods, such as neural networks and support vector machines, to the analysis of climate variability and to short-term climate prediction will be presented. Examples include the El Nino phenomenon in the tropical Pacific, and interannual variability in the Canadian winter climate and extreme weather.

Bio

William Hsieh obtained from the University of British Columbia his B.S. degree in combined honours mathematics and physics (1976), an M.S. in physics (1978), and a Ph.D. degree in oceanography and physics (1981). He did postdoctoral work at Cambridge University and at the University of New South Wales, before returning to the University of British Columbia, where he eventually became Professor in the Department of Earth and Ocean Sciences and in the Department of Physics and Astronomy, as well as the Chair of the Atmospheric Science Programme. He is currently professor emeritus, with an active research group. Best known for his pioneering work in developing and applying machine learning methods in the environmental sciences, he has over 90 peer-reviewed publications covering areas of climate variability and prediction, machine learning, oceanography, atmospheric science and hydrology. His graduate-level book "Machine Learning Methods in the Environmental Sciences -- Neural Networks and Kernels" (2009) was published by Cambridge University Press.

Coast to Coast Seminar Series: "Reading the Tea Leaves: What Lies Beyond the Standard Model?"

Department of Physics & Astronomy, McMaster University, Perimeter Institute

Date: Mar 30, 2010
Time: 11:30 - 12:30
Room: ASB10900

Abstract

The turn-on of the Large Hadron Collider (LHC) will likely fundamentally change our picture of how nature works on the smallest of distances we can probe. This lecture reviews the case for why failure to discover something is believed not to be an option; and what the successes and failures of the Standard Model tell us about what is likely to be, and not to be, out there awaiting discovery. Most proposals fall into three main categories, whose broad properties are outlined. I close with my personal opinions about what will be found.

Coast to Coast Seminar Series: "Flavour Physics: The Generation Puzzle: Symmetries and Mysteries"

Department of Physics and Astronomy, University of Victoria

Date: Mar 16, 2010
Time: 11:30 - 12:30
Room: ASB10900

Abstract

The world we experience is essentially made of three fundamental particles: the electron and the two kinds of quarks that make up protons and neutrons. Yet nature has chosen to copy this structure at least twice more, with each copy heavier than the last. How have these extra "generations" shaped the universe we live in? Studies of particles containing the heavier quarks have revealed fascinating phenomena: the pure left-handed nature of the charged Weak interaction; the spontaneous transmutation of matter into antimatter and back; and a mechanism for breaking matter-antimatter symmetry, which may be connected to the dominance of matter in our universe. Sensitive measurements in this area have guided the development of our theories and will provide constraints on any new theories that may be proposed in light of discoveries at the Large Hadron Collider. This talk will review the highlights of flavour physics from the discovery of "strange" particles through the present.

Coast to Coast Seminar Series: "The Unbearable Lightness of Being (A Neutrino)"

Department of Physics and Astronomy, University of British Columbia

Date: Mar 02, 2010
Time: 11:30 - 14:30
Room: ASB10900

Abstract

If you took an electron and stripped away all of its charge and all of its mass, would you have anything left? Incredibly enough, the answer is yes---a neutrino! Invented originally as an "accounting trick" to balance the books in nuclear reactions, we now know neutrinos to be among the lightest and hardest to detect particles in the world. Billions of them are flying through your body as you read this abstract. I will explain what we know about these phantom-like particles and the unique challenges we face in studying their properties.

Coast to Coast Seminar Series: "Probing the Origin of Mass: The First Light of ATLAS Data"

Department of Physics and Astronomy, University of Victoria and Institute of Particle Physics

Date: Feb 16, 2010
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Four decades of experimental results and theoretical developments point us to energies of one trillion electron volts, or about a thousand times the mass of the proton, to search for the processes that give mass to elementary particles. Reaching such high energies required a new particle accelerator, the Large Hadron Collider (LHC), which has recently begun operation at the CERN laboratory in Geneva, Switzerland. The physics case for the LHC and for the massive ATLAS detector, which records the results of the interactions that might produce the Higgs boson or other new particles, is discussed, and an LHC status report including a first look at ATLAS data is presented.

Coast to Coast Seminar Series: "Experimental Techniques in Particle Physics or 'What are they really doing in Geneva?!'"

Department of Physics, Simon Fraser University / TRIUMF

Date: Feb 02, 2010
Time: 11:30 - 12:30
Room: ASB10900

Abstract

With the recent startup of the Large Hadron Collider (LHC) at CERN in Geneva, there has been renewed interest in particle physics, which has led to a plethora of articles and presentations for the public on what is being done at the new experiments. This colloquium will present not the what, but the how. How do physicists study Nature at incredibly small distance scales? It is perhaps paradoxical that viewing the world at very small scale requires the largest machines ever built. This talk will present the basic physics concepts involved in experimental subatomic physics. This includes a description of the gigantic accelerators (the probes), and detectors (the eyes) used. Particle physics experiments produce an enormous amount of data. This talk will also discuss the large-scale computing necessary to mine these data, as well as the advanced analysis techniques required to extract very rare events from the preponderance of well-understood background processes.

Coast to Coast Seminar Series: "A Tour of Particle Physics"

Department of Physics, University of Toronto / TRIUMF

Date: Jan 19, 2010
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Particle physics is entering a new era with the startup of the Large Hadron Collider (LHC) at CERN in Geneva. This revolutionary new instrument will open the door to many new discoveries that will shed light on the structure of the Universe at the highest energy scales ever studied. This talk is a survey of the important open questions in particle physics, many of which will be addressed at the LHC. It will also serve as an introduction to the subsequent colloquia in the Coast-to-Coast series for winter/spring 2010.

Coast to Coast Seminar Series: "Real Intelligence: The Anticipating Brain"

Dalhousie University

Date: Dec 01, 2009
Time: 11:30 - 12:30
Room: ASB10900

Abstract

The field of AI has always been inspired by the human mind. I will try to give some perspective on new directions in AI arising from new hypotheses about cognitive processes and neuroscience. Huge progress has been made in neuroscience on one side and machine learning on the other, but these areas have developed largely independently since the exciting days of the perceptron half a century ago. However, there is now an exciting convergence of these areas of research. In particular, generative systems have made a strong impact on machine learning, and probabilistic reasoning has replaced most traditional expert-system approaches. This seminar will explore related new developments in computational neuroscience, specifically systems with a large top-down component that are capable of learning to anticipate the world. We will further discuss representations of uncertainty in the brain and how synaptic mechanisms may contribute to such systems.

Bio

Dr. Thomas Trappenberg is a professor of computer science at Dalhousie University. After graduating with a PhD in particle physics from Aachen University, Germany, he held research positions at Dalhousie University, the RIKEN Brain Science Institute in Japan, and Oxford University. He is the author of more than 60 scientific publications and of the textbook 'Fundamentals of Computational Neuroscience', published by Oxford University Press, of which the second edition is about to be released.

Coast to Coast Seminar Series: "World-Mediated Robot Intelligence"

Simon Fraser University

Date: Nov 17, 2009
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Unlike disembodied AI programs, robots are embedded in the same physical world as humans and other animals. Like animals, they must act appropriately - we say "with intelligence" - to achieve their goals. The real world presents problems of uncertainty and the danger of running out of energy. Yet the world presents resources that can help the robot, such as other agents. This talk discusses robot systems that sense and exploit regularities in the behaviour of other robots and animals to obtain energy and work. The recharging problem is fundamental and under-explored, yet solved in some way by all intelligent creatures. It can serve as a useful focus for AI and Artificial Life research, and is the central purpose of my Autonomy Laboratory.

Coast to Coast Seminar Series: "Cognitive Dynamic Systems"

McMaster University

Date: Nov 03, 2009
Time: 11:30 - 12:30
Room: ASB10900

Abstract

In this lecture, I will describe a new generation of engineering systems with cognition as the enabler.

I will begin by describing the perception-action cycle that is basic to the visual brain. Then I will demonstrate how cognitive information (signal) processing is so basic to the underlying theory and design of:
- cognitive mobile assistants,
which constitute the three pillars of my research program.

I will finish the lecture by doing two things:
- present new and exciting results on cognitive tracking radar and thereby demonstrate the power of cognition; and
- describe my vision on future research on Cognitive Dynamic Systems.

Coast to Coast Seminar Series

University of Toronto

Date: Oct 20, 2009
Time: 11:30 - 12:30
Room: ASB10900

TBA

Coast to Coast Seminar Series: "Computer (and Human) Perfection at Checkers"

University of Alberta

Date: Oct 06, 2009
Time: 11:30 - 12:30
Room: ASB10900

Abstract

In 1989 the Chinook project began with the goal of winning the human World Checkers Championship. There was an imposing obstacle to success -- the human champion, Marion Tinsley. Tinsley was as close to perfection at the game as was humanly possible. To be better than Tinsley meant that the computer had to be perfect. In effect, one had to solve checkers. Little did we know that our quest would take 18 years to complete. In this talk, the creator of Chinook tells the story of the quest for computer and human perfection at the game of checkers.

Coast to Coast Seminar Series: "Oscillations in a Patchy Environment Disease Model"

Department of Mathematics and Statistics, University of New Brunswick

Date: Mar 31, 2009
Time: 11:30 - 12:30
Room: ASB10901

Abstract

For a single-patch SIRS model with a period of immunity of fixed length, recruitment-death demographics, disease-related deaths and mass action incidence, the basic reproduction number R0 is identified. It is shown that the disease-free equilibrium is globally asymptotically stable if R0 < 1. For R0 > 1, local stability of the endemic equilibrium and Hopf bifurcation analysis about this equilibrium are carried out. Moreover, a practical numerical approach to locate the bifurcation values for a characteristic equation with delay-dependent coefficients is provided. For a two-patch SIRS model with travel, it is shown that there are several threshold quantities determining its dynamic behavior and that 1) travel can reduce oscillations in both patches; 2) travel may enhance oscillations in both patches; 3) travel can also switch oscillations from one patch to another.
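For readers who want to experiment, here is a minimal single-patch SIRS integration. This is a simplified ODE variant with exponentially distributed immunity, not the fixed-length-immunity delay model of the talk, and all parameter values are made up. It exhibits the R0 threshold: below 1 the infection dies out, above 1 it settles toward an endemic level.

```python
def simulate_sirs(beta, gamma, xi, mu, days=1000, dt=0.01):
    """Forward-Euler SIRS fractions with births/deaths and loss of immunity.

    beta: transmission rate, gamma: recovery rate,
    xi: rate of immunity loss (R -> S), mu: birth/death rate."""
    S, I, R = 0.99, 0.01, 0.0
    for _ in range(int(days / dt)):
        dS = mu - beta * S * I + xi * R - mu * S
        dI = beta * S * I - gamma * I - mu * I
        dR = gamma * I - xi * R - mu * R
        S, I, R = S + dS * dt, I + dI * dt, R + dR * dt
    return S, I, R

def r0(beta, gamma, mu):
    """Basic reproduction number for this variant: new infections per case."""
    return beta / (gamma + mu)
```

In this variant the approach to the endemic equilibrium is a damped oscillation; the talk's fixed-delay immunity and patch coupling are what allow sustained oscillations and the Hopf bifurcations discussed above.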

Coast to Coast Seminar Series: "Poisson Structures investigated with Computer Algebra"

Department of Mathematics, Brock University

Date: Mar 17, 2009
Time: 11:30 - 12:30
Room: ASB10901

Abstract

This seminar will discuss Poisson structures, which play an important role in both pure mathematics and applications. After a brief explanation of this notion, it will be shown how the computation of Poisson structures leads to the problem of solving overdetermined algebraic systems. In the following demonstration, a package developed by one of the presenters will be used to solve such systems. Throughout the talk, extensive comments will be made concerning general issues that come up in large-scale computer algebra computations. If time permits, examples of other large-scale computations will be given.

Bio

Dr. Odesski's research interests include Integrable Systems, Mathematical Physics, Computer Algebra, High Performance Computing, Representation Theory, Non-commutative Geometry, and Algebraic Geometry. His main research interests are in Mathematical Physics, in the sense of mathematics inspired by ideas from Theoretical Physics. More precisely, he is interested in algebraic and geometric structures which come from quantum field theory, statistical mechanics and the theory of integrable systems. He is currently Brock SharcNet Research Chair. Dr. Wolf's research interests include differential equations and integrability, computer algebra, General Relativity and special aspects of optimization and artificial intelligence. His work with computer algebra concerns algorithms to simplify and solve overdetermined systems of equations (linear or non-linear; algebraic, ordinary differential (ODEs) or partial differential (PDEs)). These basic algorithms are applied in higher-level programs for the determination of symmetries, conservation laws or other properties of differential equations. Applications include the classification of integrable systems of evolutionary scalar PDEs, vector PDEs, single and systems of supersymmetric evolutionary PDEs, and recently integrable quadratic Hamiltonians with higher-degree first integrals, as well as discrete integrable systems from Discrete Differential Geometry.

Coast to Coast Seminar Series: "Pattern Formation in Reaction-Diffusion Systems"

Department of Mathematics and Statistics, Dalhousie University

Date: Mar 03, 2009
Time: 11:30 - 12:30
Room: ASB10908

Abstract

In this talk I will discuss the behaviour of two component reaction-diffusion systems. Typically the diffusion of one of the components will be much larger than the other. The difference in the diffusion coefficients can result in the formation of localized structures. I will consider the types of structures which can be formed and the wide variety of bifurcations these structures may undergo.
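The mechanism described above, where a large disparity in diffusion coefficients produces localized structures, can be sketched with the Gray-Scott system, a standard two-component example chosen here purely for illustration (the parameters below are textbook-style values, not from the talk). One component diffuses twice as fast as the other, and a seeded perturbation sharpens into localized pulses rather than smoothing out.

```python
import numpy as np

def gray_scott_1d(n=256, steps=10000, Du=0.16, Dv=0.08, F=0.04, k=0.06, dt=1.0):
    """Explicit Euler for the 1D Gray-Scott system on a periodic lattice.

    u is the substrate (faster diffusion), v the autocatalyst:
        u_t = Du*u_xx - u*v^2 + F*(1-u)
        v_t = Dv*v_xx + u*v^2 - (F+k)*v
    """
    u = np.ones(n)
    v = np.zeros(n)
    u[n // 2 - 5 : n // 2 + 5] = 0.5      # seed a localized disturbance
    v[n // 2 - 5 : n // 2 + 5] = 0.25
    for _ in range(steps):
        lap_u = np.roll(u, 1) + np.roll(u, -1) - 2 * u
        lap_v = np.roll(v, 1) + np.roll(v, -1) - 2 * v
        uvv = u * v * v
        u += dt * (Du * lap_u - uvv + F * (1 - u))
        v += dt * (Dv * lap_v + uvv - (F + k) * v)
    return u, v
```

Varying F and k moves the system through the wide variety of bifurcations the talk describes: pulses can persist, split, or collapse back to the homogeneous state.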

Coast to Coast Seminar Series: "Calculating, modelling and understanding turbulence using adaptive wavelets"

Department of Mathematics and Statistics, McMaster University

Date: Feb 17, 2009
Time: 11:30 - 12:30
Room: ASB10901

Abstract

Turbulence remains an outstanding challenge, both theoretically and computationally. In this talk I will explain why a theory of turbulence has proved elusive, and why it is difficult to accurately simulate realistic turbulent flows. Scientific computation based on the adaptive wavelet transform may dramatically reduce the complexity of accurate turbulence simulations. I will give examples of adaptive wavelet numerical simulations of turbulence which take advantage of turbulence intermittency. Finally, I will argue that effective and accurate simulations of turbulence will help advance our understanding of the mathematical structure of the Navier-Stokes equations in the limit of large Reynolds numbers.

Coast to Coast Seminar Series: "Differential Equation Models of Infectious Disease Dynamics"

Department of Mathematics and Statistics, University of New Brunswick

Date: Feb 03, 2009
Time: 11:30 - 12:30
Room: ASB10901

Abstract

The early disease transmission model of Kermack and McKendrick established two main results that are still at the core of most disease transmission models today: the basic reproduction number, R0, as a threshold for disease spread in a population; and the final size of an epidemic. This early model consisted of a single integral equation for the incidence of infection over time. As models become more complex, the relationships between disease spread, final size and R0 are not as clear; yet R0 remains the main object of study when comparing control measures. This talk will formulate R0 for more complex models and outline results for the final size and bifurcations from the disease-free solution.
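The final-size result mentioned above can be made concrete. For the simplest Kermack-McKendrick model, the fraction z of the population eventually infected satisfies the standard textbook relation z = 1 - exp(-R0 * z) (the symbol z is my notation). The sketch below solves it by fixed-point iteration.

```python
import math

def final_size(R0, iters=1000, tol=1e-12):
    """Solve z = 1 - exp(-R0 * z) for the epidemic final size z.

    z is the fraction of the population eventually infected; for
    R0 <= 1 the only non-negative root is z = 0 (no epidemic)."""
    if R0 <= 1.0:
        return 0.0
    z = 0.5
    for _ in range(iters):
        z_new = 1.0 - math.exp(-R0 * z)
        if abs(z_new - z) < tol:
            return z_new
        z = z_new
    return z
```

Note how sharply the final size grows past the threshold: R0 = 1.5 already infects over half the population, and R0 = 2 nearly 80%, which is why R0 is the main object of study when comparing control measures.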

Coast to Coast Seminar Series: "Fourier Spectral Computing on the Sphere"

Department of Mathematics, Simon Fraser University

Date: Jan 20, 2009
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Although spherical coordinates arise naturally in many applications, numerical routines for computing PDEs on a spherical surface are not yet commonplace tools for the relatively uninitiated. While spherical harmonics and finite-element methods are well-developed approaches, neither possesses the essential simplicity of computing on the 2D periodic domain with FFT-based spectral methods. It is little appreciated that fast Fourier transforms for the spherical surface have been implemented using the fact that longitude-latitude coordinates can be double-mapped to the torus. Combining this idea with a choice of Fourier basis for which the Laplacian is a sparse matrix operation, implicit time-stepping for diffusion is implemented in a spectrally-fast manner. In this Coast-to-Coast presentation, the elementary ideas behind this simple FFT-based approach to PDE computations on the sphere will be illustrated with a series of MATLAB demo codes. The codes will be made available prior to the seminar, and participants are highly encouraged to run their own tests during the session. The fast algorithms will allow reasonably resolved computations to execute in minutes on even a basic laptop computer (with the MATLAB software installed). This spectral method is applied to several examples of diffusion-driven dynamics in models of pattern formation.
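The "essential simplicity" of FFT-based spectral methods on a doubly periodic domain, which the double mapping of longitude-latitude coordinates to the torus preserves, comes down to the Laplacian being diagonal in the Fourier basis. Below is a minimal sketch of one implicit diffusion step on the plain 2D torus, written in Python/NumPy rather than the MATLAB sphere codes of the talk (the function name and parameters are mine, for illustration only).

```python
import numpy as np

def heat_step_fft(u, dt, nu=1.0):
    """One implicit (backward Euler) step of u_t = nu*Lap(u) on the
    doubly periodic square [0, 2*pi)^2, diagonalized by the 2D FFT."""
    n = u.shape[0]
    k = np.fft.fftfreq(n, d=1.0 / n)          # integer wavenumbers 0, 1, ..., -1
    kx, ky = np.meshgrid(k, k, indexing="ij")
    u_hat = np.fft.fft2(u)
    u_hat /= 1.0 + nu * dt * (kx ** 2 + ky ** 2)  # solve (I - nu*dt*Lap) u_new = u
    return np.real(np.fft.ifft2(u_hat))
```

Because the implicit solve is a pointwise division in Fourier space, each step costs only two FFTs; on the sphere the extra work is choosing a basis for which the Laplacian remains similarly sparse.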

Coast to Coast Seminar Series: Live from Halifax "Artistic Image Processing"

Dalhousie University

Date: Dec 09, 2008
Time: 11:30 - 12:30
Room: ASB10901

Abstract

Non-photorealistic or 'artistic' rendering is a branch of computer graphics that aims to mimic styles of art algorithmically. Non-photorealistic rendering (NPR) takes its inspiration from historic works created in the natural media of painting, drawing and illustration, and can be made to operate directly on 3D geometry or on existing 2D images. Given the complexity of the task, NPR systems typically focus on the replication of a single artistic style. In this talk I discuss two techniques for artistic image processing, both of which have recently appeared in IEEE Transactions on Visualization and Computer Graphics. The first, entitled "Mixed Media Painting and Portraiture", presents a technique to transform digital images into renderings that approximate the appearance of mixed media artwork, which incorporates two or more traditional visual media. This is achieved by first separating an input image into distinct regions based on the detail present in the image. Each region is then processed independently with a user-selected NPR filter. This allows the user to treat highly detailed regions differently from regions of low-frequency content. The separately processed regions are then blended in the gradient domain. In addition, the work is extended to the rendering of mixed media portraits. Portraits pose unique challenges that we address with a method of segmentation based on a composite of face detection and image detail. The approach offers the user a great deal of flexibility over the end result, while at the same time requiring very little input. The second approach, "Image-Based Stained Glass", attempts to simulate the appearance of stained glass artwork. A stained glass window possesses a distinctive style partly due to the unique color ranges produced through the interaction of color enamels, glass and light. The imposition of lead calmes further separates the appearance of stained glass from other media. To simulate this, a novel approach has been developed that involves image warping, segmentation, querying, and colorization along with texture synthesis. In the method, a given input image is first segmented. Each segment is subsequently transformed to match real segments of stained glass queried from a database of image exemplars. By using real sources of stained glass, the method produces high-quality results and, again, requires only modest amounts of user interaction.

Bio

Stephen Brooks is an Assistant Professor in the Faculty of Computer Science at Dalhousie University. He received a Ph.D. in Computer Science from the University of Cambridge in 2004, a M.Sc. from the University of British Columbia in 2000 and a B.Sc. from Brock University in 1998. He is a member of IEEE, ACM, and Canadian Information Processing Society (CIPS). His research interests include Computer Graphics, Visualization and 3D GIS, and is a co-founder of the GVLab (www.gvlab.ca). His primary interest outside academia, visual arts, complements and drives his interest in non-photorealistic computer graphics. In parallel with his academic work, he enjoys engaging in life drawing with graphite pencil, charcoal pencil and conté crayon.

Coast to Coast Seminar Series: Live from Edmonton "Advanced Collaborative Infrastructure for Real-Time Computational Steering in Scientific Computing"

University of Alberta

Date: Nov 25, 2008
Time: 11:30 - 12:30
Room: ASB10901

Abstract

Advances in computer processing power and networking over the past few years have brought a significant change to the modeling and simulation of complex phenomena. Problems that formerly could only be tackled in batch mode, with their results visualized afterwards, can now be monitored graphically while in progress; in certain cases it is even possible to alter parameters of the computation while it is running, depending on what the scientist sees in the current visual output. This ability to monitor and change parameters of the computational process at any time and from anywhere is called computational steering. By combining this capability with advanced communications tools, such as the Access Grid, over high-speed networks, it is now possible for a group of scientists located across various continents to work collaboratively on simulations, allowing them to compare ideas and to share their experience. This is a key advance: as scientific problems get larger and more complex, the notion of a scientist working alone in his laboratory is disappearing. At the University of Alberta numerous scientific projects are already using this technology, to name a few: the Virtual Wind Tunnel Project at the Computer Science Department; Project CyberCell at the Institute for Bio-molecular Design; the Earth Core Simulation at the Department of Physics; and the Simulation of Subatomic Physics at the Department of Physics. Many of these projects use high-performance computing facilities such as those provided by the WestGrid infrastructure. Some projects are very close to the final goal of a truly interactive simulator, and others are planning to reach it. From our experience collaborating with various departments, it is clear that there is a great need for such systems and for an infrastructure capable of providing this capability. During this presentation, we will present the current status of the development of these facilities and share the experience gained so far.

Bio

Dr. Boulanger worked for 18 years at the National Research Council of Canada as a senior research officer, where his primary research interests were 3D computer vision, rapid product development, and virtualized reality systems. He now has a double appointment as a professor at the University of Alberta in the Department of Computing Science and in the Department of Radiology and Diagnostic Imaging. His main research and teaching topic is virtualized reality systems. He is also principal investigator for new media at TRLabs. In 2004, Dr. Boulanger was awarded an iCORE/TRLabs industrial chair in Collaborative Virtual Environment. He has published more than 200 scientific papers in various journals and conferences. He is on the editorial board of two major academic journals. Dr. Boulanger is also on many international committees and frequently gives lectures on rapid product development and virtualized reality. He is the Director of the Advanced Man Machine Interface Laboratory and the scientific director of the Alberta Radiological Visualization Center. On the commercial side, Dr. Boulanger is the president of PROTEUS Consulting Inc., an Alberta-based consulting firm specializing in Virtual Reality Applications.

Coast to Coast Seminar Series: Live from St. John's, Newfoundland "Real-time Foreground Segmentation from Dynamic Backgrounds on GPU"

Memorial University

Date: Oct 28, 2008
Time: 11:30 - 12:30
Room: ASB10900

Abstract

This talk discusses the problem of foreground separation from the background modeling perspective. In particular, we deal with the difficult scenarios where the background texture might change spatially and temporally. A novel approach is proposed that incorporates a pixel-based online learning method to adapt to temporal background changes promptly, together with a graph cuts method to propagate per-pixel evaluation results over nearby pixels. Empirical experiments on a variety of datasets demonstrate the competitiveness of the proposed approach, which is also able to work in real-time on the Graphics Processing Unit (GPU) of programmable graphics cards.
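The abstract does not give the specific online learner or the graph-cuts propagation step; as a hedged illustration of the background-modeling perspective, here is the simplest per-pixel online model -- an exponentially weighted running average with thresholding, with no spatial propagation (names and parameters are mine):

```python
import numpy as np

class RunningAverageBackground:
    """Per-pixel background model: an exponentially weighted running
    average, updated online so the model adapts to gradual background
    change. (The talk's approach additionally propagates per-pixel
    decisions to neighbouring pixels with graph cuts; omitted here.)"""

    def __init__(self, first_frame, alpha=0.05, threshold=30.0):
        self.mean = first_frame.astype(float)
        self.alpha = alpha          # adaptation rate
        self.threshold = threshold  # intensity difference for "foreground"

    def apply(self, frame):
        frame = frame.astype(float)
        mask = np.abs(frame - self.mean) > self.threshold  # foreground pixels
        # Update the model only where the pixel looks like background,
        # so foreground objects are not absorbed into the model at once.
        self.mean = np.where(mask, self.mean,
                             (1 - self.alpha) * self.mean + self.alpha * frame)
        return mask

bg = RunningAverageBackground(np.full((8, 8), 100.0))
frame = np.full((8, 8), 100.0)
frame[2:4, 2:4] = 200.0           # a bright "object" enters the scene
mask = bg.apply(frame)
```

Because every pixel is independent, this kind of model maps naturally onto the GPU, which is what makes the real-time claim of the talk plausible.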

Coast to Coast Seminar Series: Live from Calgary, Alberta "Computer Visualization in Urban Planning and Development"

University of Calgary

Date: Oct 14, 2008
Time: 11:30 - 12:30
Room: ASB10901

Abstract

The use of computer modeling in planning can help communities understand the role of density, building mass and architectural character in creating urban space. In this presentation, case studies will illustrate how urban planners, community groups and developers can use computer visualization to resolve conflict and promote understanding about urban form and development.

Bio

Dr. Levy is a Professor of Planning and Urban Design at The University of Calgary, where he serves as the Director of the Program in Real Estate Development. Since 1996, Dr. Levy has also served as Director of Computing for the Faculty of EVDS. Dr. Levy is a founding member of the Virtual Reality Lab. Dr. Levy speaks at international and national conferences in the fields of virtual reality, 3D imaging, education, archaeology and planning. His published work appears in journals such as Internet Archaeology, IEEE MultiMedia, Journal of Visual Studies, Environment and Planning and Plan Canada.

Coast to Coast Seminar Series: Live from Hamilton, Ontario "Visualization of Reciprocal Space - 3D X-ray Diffraction"

McMaster University

Date: Sep 30, 2008
Time: 11:30 - 12:30
Room: ASB10901

Abstract

Crystallographers have been collecting diffraction data for the characterization of single crystals and polycrystalline solids for many years. We rotate the samples in an X-ray beam and use 2D detectors to collect a series of images, generating gigabytes of 3D data for mathematical analysis. Using the MAX3D software developed at McMaster we are finally able to visualize these data as volume objects in reciprocal space. In this presentation, we will briefly define reciprocal space and its relation to direct space, structural information content, and the experimental 2D images. We will provide a variety of examples from Chemistry, Physics, and Engineering of how the visualization of the complete data sets leads to a better understanding of the samples, the experiments, and sometimes yields more information than expected. Single crystal diffraction generates a lattice of Bragg diffraction spots of various intensities in reciprocal space, and a Fourier transform with appropriate phasing reveals the molecular or solid state structure in direct space. A clean diffraction pattern requires a regular three-dimensional repeat pattern in the crystal. Short range 1D or 2D ordering in certain crystals results in diffuse 2D or 1D diffraction features. These are difficult to recognize without the ability to see the full 3D diffraction pattern. Polycrystalline solids such as ceramics, alloys, thin films, and even polymers have physical properties which depend on the size and preferred orientation of the micro- or nano-crystal domains. Visualization of the intensity distribution on the diffraction spheres is the most efficient way to follow changes in a material. The 3D images of reciprocal space also serve as excellent teaching aids for crystallographic theory and practice. This is joint work with Weiguang Guan, High Performance and Research Computing Support, McMaster University, Hamilton, ON.
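For readers outside crystallography: "reciprocal space" here is the standard construction (not defined in detail in the abstract). Given direct lattice vectors a1, a2, a3, the reciprocal basis is, in the physics convention with the 2π factor (crystallographers often omit it):

```latex
% Reciprocal lattice basis from direct lattice vectors a_1, a_2, a_3
\mathbf{b}_1 = 2\pi\,\frac{\mathbf{a}_2 \times \mathbf{a}_3}
                          {\mathbf{a}_1 \cdot (\mathbf{a}_2 \times \mathbf{a}_3)},\qquad
\mathbf{b}_2 = 2\pi\,\frac{\mathbf{a}_3 \times \mathbf{a}_1}
                          {\mathbf{a}_1 \cdot (\mathbf{a}_2 \times \mathbf{a}_3)},\qquad
\mathbf{b}_3 = 2\pi\,\frac{\mathbf{a}_1 \times \mathbf{a}_2}
                          {\mathbf{a}_1 \cdot (\mathbf{a}_2 \times \mathbf{a}_3)},
\qquad \mathbf{b}_i \cdot \mathbf{a}_j = 2\pi\,\delta_{ij}.
```

The Bragg spots of a single crystal then sit at integer combinations of the b_i, which is the lattice the MAX3D volume renderings make visible.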

Bio

Dr. Jim Britten is an Assistant Professor in the Chemistry Department at McMaster University and Manager of the McMaster Analytical X-ray (MAX) Diffraction Facility, a joint operation with the Brockhouse Institute for Materials Research. He has been a crystallographer for 25 years, with over a hundred publications. He is a Council member of the American Crystallographic Association and program chair for the 2009 ACA meeting in Toronto. He is Vice-Chair of the Canadian National Committee for Crystallography, a member of the International Program Committee for the 2011 International Union of Crystallography Congress in Madrid, and Chair of the IPC for the 2014 IUCr meeting in Montreal. His collaboration with visualization programmer Weiguang Guan over the past two years has led to the MAX3D package.

Coast to Coast Seminar Series: Live from Burnaby, British Columbia "Optimal regular sampling and reconstruction in three dimensions"

Simon Fraser University

Date: Sep 16, 2008
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Efficient and accurate sampled representation of continuous data is the foundation of computational science. The study of sampling and interpolation goes back more than 2000 years. The focus of that body of work has mostly been on one-dimensional signals; higher-dimensional signals have been dealt with in a separable manner, i.e. the x, y, and z axes are treated independently. This led to the introduction of the Cartesian lattice. Ease of comprehension and simplified algorithmic treatment have been convincing arguments, such that even today the Cartesian lattice is ubiquitous in dealing with multi-dimensional data. In contrast, we have known at least since Kepler that there are more efficient structures for the representation of multi-dimensional signals. In this talk I focus on the body-centered and face-centered cubic lattices in three dimensions. I describe these lattices and highlight features that make them suitable for sampled data representations. Since sampling lattices represent continuous phenomena, efficient interpolation within such lattices is key to their successful adoption in computational science. I present advances in interpolation techniques on body-centered cubic lattices that allow me to argue for their superiority in practical applications over the Cartesian approach. In particular, in graphics and visualization the visual appearance and perception of sampled 3D phenomena is of great importance. I present research showing that these lattices are also superior in visual perception to traditional Cartesian lattices. I conclude my talk with an outlook on our current research on data acquisition on such lattices in the medical domain using iterative reconstruction methods, as well as in the computational domain using the Lattice-Boltzmann method for solving partial differential equations. This talk is a summary of a number of years of research together with my students and collaborators. I'd like to especially acknowledge the contributions of Alireza Entezari, who is currently an Assistant Professor at the University of Florida.
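As a hedged sketch (notation mine, not from the talk): the body-centered cubic lattice is simply the Cartesian integer lattice plus a half-offset copy of itself, which makes it easy to generate and index:

```python
import numpy as np

def bcc_points(n):
    """Sample points of the body-centered cubic (BCC) lattice inside
    [0, n)^3: the Cartesian integer grid plus a copy shifted by
    (1/2, 1/2, 1/2) -- i.e. a point at the center of every cube cell."""
    g = np.arange(n)
    corners = np.stack(np.meshgrid(g, g, g, indexing="ij"), -1).reshape(-1, 3)
    centers = corners + 0.5
    return np.vstack([corners, centers]).astype(float)

pts = bcc_points(4)   # 4^3 corner points + 4^3 center points
```

With this scaling the nearest-neighbour distance is sqrt(3)/2, from a corner to the adjacent cell center; it is this tighter packing of samples that underlies the lattice's efficiency for isotropically band-limited data.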

Bio

Torsten Möller is an associate professor at the School of Computing Science at Simon Fraser University. He received his PhD in Computer and Information Science from Ohio State University in 1999 and a Vordiplom (BSc) in mathematical computer science from Humboldt University of Berlin, Germany. He is a member of IEEE, ACM, Eurographics, and Canadian Information Processing Society (CIPS). His research interests include the fields of Visualization and Computer Graphics, especially the mathematical foundations thereof. He is the director of Vivarium, co-director of the Graphics, Usability and Visualization Lab (GrUVi) and serves on the Board of Advisors for the Centre for Scientific Computing at Simon Fraser University. He is the appointed Vice Chair for Publications of the IEEE Visualization and Graphics Technical Committee (VGTC). He has served on a number of program committees (including the Eurographics and IEEE Visualization conferences) and has been papers cochair for EuroVis, Graphics Interface, and the Workshop on Volume Graphics as well as the Visualization track of the 2007 International Symposium on Visual Computing. He has also co-organized the 2004 Workshop on Mathematical Foundations of Scientific Visualization, Computer Graphics, and Massive Data Exploration at the Banff International Research Station. He is currently serving on the steering committee of the Symposium on Volume Graphics. Further, he is an associate editor for the IEEE Transactions on Visualization and Computer Graphics (TVCG) as well as the Computer Graphics Forum.

Coast to Coast Seminar Series: Live from Halifax, Nova Scotia "Cache-Oblivious Geometric Algorithms"

Dalhousie University

Date: Apr 01, 2008
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Apart from main memory and disk drives, modern computers are equipped with multiple levels of cache, in order to bridge the gap between the CPU's processing speed and the access latency of main memory. Yet traditional algorithms are not designed to take advantage of cache memory, which makes these caches far less effective than they could be. The cache-oblivious model elegantly bridges the gap between traditional models of computation and the real world of multilevel memory hierarchies, as it allows us to design algorithms in the traditional way, yet obtain algorithms that optimally use all levels of cache present in the computer. Abstractly, the idea behind cache-oblivious algorithms is to consider the memory to be one big array and to lay out the data so that items accessed in short sequence are close to each other in this array. This talk will give a short introduction to techniques for achieving this and then move on to discussing solutions and challenges in designing such algorithms for geometric problems. The talk will focus mostly on constructing data structures for efficient range searching in the plane and, if there is time, also touch briefly on finding intersections between line segments in the plane. For range searching, the problem can be posed very naturally as finding a short sequence with certain locality properties, thereby completely abstracting away the computational process and giving rise to interesting questions in extremal combinatorics.
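The geometric constructions of the talk are beyond a short sketch, but the core idea -- recursive subdivision that is automatically local at every scale, with no cache parameters anywhere in the code -- is the same one behind the textbook cache-oblivious matrix transpose (illustrative code of mine, not from the talk):

```python
def transpose_rec(a, b, r0, r1, c0, c1, cutoff=16):
    """Cache-obliviously write the transpose of a[r0:r1][c0:c1] into b.
    Recursive halving means that at *some* level of the recursion the
    working set fits into each cache level, whatever its size -- no
    cache parameters appear in the code."""
    if (r1 - r0) * (c1 - c0) <= cutoff:
        for i in range(r0, r1):
            for j in range(c0, c1):
                b[j][i] = a[i][j]
    elif r1 - r0 >= c1 - c0:              # split the longer dimension
        m = (r0 + r1) // 2
        transpose_rec(a, b, r0, m, c0, c1, cutoff)
        transpose_rec(a, b, m, r1, c0, c1, cutoff)
    else:
        m = (c0 + c1) // 2
        transpose_rec(a, b, r0, r1, c0, m, cutoff)
        transpose_rec(a, b, r0, r1, m, c1, cutoff)

n, m = 10, 7
a = [[i * m + j for j in range(m)] for i in range(n)]
b = [[None] * n for _ in range(m)]
transpose_rec(a, b, 0, n, 0, m)
```

In Python the cache effect is masked by interpreter overhead; the same recursion in C is where the model's optimality guarantee becomes measurable.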

Coast to Coast Seminar Series: Live from Vancouver, British Columbia "A Broad Empirical Study of IT Security Practitioners"

University of British Columbia

Date: Mar 18, 2008
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Security of information technology (IT) has become a critical issue for organizations as they must protect their information assets from unauthorized access and quickly resume business activities after a security breach. In order for technological solutions to provide effective support to IT security practitioners, tool developers need to understand better not only the technical, but also the human and organizational dimensions of IT security. To date, there is little empirical evidence about how human, organizational, and technological factors impact the processes of managing IT security. Moreover, little is known about the responsibilities and roles of security practitioners or the effectiveness of their tools and security management practices. The Human, Organization, and Technology Centred Improvement of IT Security Administration (HOT Admin) research project is working to fill this gap.

Coast to Coast Seminar Series: Live from Halifax, Nova Scotia "Classification in Genetic Programming: a Cooperative - Competitive Coevolution Approach"

Dalhousie University

Date: Mar 04, 2008
Time: 11:30 - 12:30
Room: ASB10900

Abstract

The method of Pareto dominance is increasingly being used within the context of coevolutionary approaches to Genetic Programming (GP). GP is a machine learning approach based on a neo-Darwinian metaphor for resolving the credit assignment problem. Coevolution provides a mechanism for establishing engagement between learner and domain; or resolving interactions between models with different behavioral contributions, thus problem decomposition. Pareto dominance has come to the fore as a formal mechanism for aiding both of these coevolutionary endeavors. In this presentation we will detail an approach to model building for the classification domain such that the Pareto coevolutionary scheme facilitates scalability to large data sets and acts as a natural mechanism for problem decomposition among cooperating classifiers. Specific comparisons will be made with classical machine learning algorithms and other GP classifiers.
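Pareto dominance itself, the one piece of the machinery simple enough to show here, can be sketched as follows. Objective vectors are assumed to be maximized (a choice the abstract does not specify), and the function names are mine:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (maximization):
    a is at least as good in every objective and strictly better in one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """The vectors not dominated by any other vector in the set."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

scores = [(1, 2), (2, 1), (2, 2), (0, 0)]
front = pareto_front(scores)
```

In the coevolutionary setting of the talk, the objectives are outcomes against individual test cases (or learners), so the non-dominated front retains models with distinct behavioral contributions -- the decomposition mechanism the abstract describes.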

Coast to Coast Seminar Series: Live from Winnipeg, Manitoba "Ramsey Theory and the Infinite"

University of Manitoba

Date: Feb 19, 2008
Time: 11:30 - 12:30
Room: ASB10901

Abstract

This talk is an invitation to infinite Ramsey theory, accessible to most mathematicians. Most combinatorists are familiar with Ramsey theory regarding finite structures, and many are aware of some infinitary techniques often used to solve Ramsey questions in the finite, for example, ultrafilters, harmonic analysis, and ergodic techniques. However, it seems that few combinatorists are familiar with much infinite Ramsey theory. On the other hand, topologists, analysts, and set theorists seem to regularly use Ramsey theory in the infinite, but it seems that only a few basic theorems find application. I survey some infinite Ramsey-type theorems (with few or no proofs) and hope to reveal some surprises. One surprise to me is that, often, topology is required to refine an infinite Ramsey-type statement before finding a proof for that statement. My expertise is not infinite Ramsey theory, and I claim no expertise in topology, but I hope to bring the audience to the point where it is clear that topology might help, or even be required, to further advance the field of Ramsey theory. For those not familiar with Ramsey theory, a typical theorem has the form: for any r (number of colours), H (small structure or set) and G (medium), there exists a (large) F so that for any r-colouring of the H-substructures of F, there exists a G-substructure in F all of whose H-substructures are monochromatic. For example, the pigeonhole principle is such a theorem (where H is a single vertex). In Ramsey's original theorem, r is finite, H, G, and F are simply sets, where H is finite, and G is finite or countably infinite.
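As a concrete finite instance of the theorem shape just described (an illustration of mine, not from the talk): exhaustive search confirms that every 2-colouring of the edges of K6 contains a monochromatic triangle, while K5 admits a colouring with none -- i.e. the Ramsey number R(3,3) = 6:

```python
from itertools import combinations, product

def has_mono_triangle(n, colouring):
    """colouring maps each edge (i, j) with i < j of K_n to colour 0 or 1."""
    return any(colouring[(i, j)] == colouring[(j, k)] == colouring[(i, k)]
               for i, j, k in combinations(range(n), 3))

def every_colouring_has_mono_triangle(n):
    """Check all 2^C(n,2) edge colourings of K_n."""
    edges = list(combinations(range(n), 2))
    return all(has_mono_triangle(n, dict(zip(edges, cols)))
               for cols in product((0, 1), repeat=len(edges)))

# K5 has a triangle-free colouring (pentagon vs. pentagram); K6 does not.
```

Here H is an edge, G a triangle, F the complete graph; the infinite theorems of the talk replace F by infinite structures, where (as the abstract notes) topology starts to play a role.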

Coast to Coast Seminar Series: Live from Fredericton, New Brunswick "Noncommutative Surfaces"

University of New Brunswick

Date: Feb 05, 2008
Time: 11:30 - 12:30
Room: ASB10901

Abstract

Dr. Colin Ingalls is from the Department of Mathematics and Statistics, University of New Brunswick. We will give several examples of noncommutative surfaces and present the classification of noncommutative surfaces that are finite over their centres. We will also discuss interactions with algebraic geometry and noncommutative algebra.

Coast to Coast Seminar Series: Live from Burnaby, British Columbia "Algorithmic problems in biomolecular network analysis"

Simon Fraser University

Date: Jan 22, 2008
Time: 11:30 - 12:30
Room: ASB10900

Abstract

As biomolecular networks, and in particular protein-protein interaction networks, become more and more available it becomes of significant interest to emulate them via random processes, to compare them and detect their similarities under various measures, to determine some of their interesting topological features and to identify network motifs of interest. All of these problems are computationally hard and thus require novel algorithmic strategies towards their resolution. In this talk, we will discuss some of these algorithmic challenges and hopefully present a few success stories.

Coast to Coast Seminar Series: Live from St. John's, Newfoundland "Existential Closure and BIBD Block-Intersection Graphs"

Memorial University of Newfoundland

Date: Dec 04, 2007
Time: 11:30 - 12:30
Room: ASB10900

Abstract

A graph G with vertex set V is said to be n-existentially closed (or n-e.c. for short) if, for every proper subset S of V with |S|=n and every subset T of S, there exists a vertex x in V-S such that x is adjacent to each vertex of T but is adjacent to no vertex of S-T. A balanced incomplete block design (BIBD) with parameters (v,k,lambda) consists of a set of blocks, each of which is a k-subset of a set V of cardinality v, such that each 2-subset of V occurs in precisely lambda of the blocks of the design. Given a combinatorial design D with block set B, its block-intersection graph is the graph having B as its vertex set, such that two vertices b_1 and b_2 are adjacent if and only if b_1 and b_2 have non-empty intersection. In this talk we will present some recent results concerning balanced incomplete block designs (BIBDs) and when their block-intersection graphs are n-existentially closed. This is joint work with Neil A. McKay.
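The n-e.c. definition above translates directly into a brute-force (exponential-time) check, sketched here for small graphs given as adjacency sets:

```python
from itertools import combinations

def is_n_ec(adj, n):
    """Brute-force test of the n-existentially-closed property:
    for every n-set S of vertices and every subset T of S, some vertex x
    outside S is adjacent to every vertex of T and to no vertex of S - T.
    adj: dict mapping each vertex to the set of its neighbours."""
    V = set(adj)
    for S in combinations(V, n):
        S = set(S)
        for t in range(n + 1):
            for T in combinations(S, t):
                T = set(T)
                if not any(T <= adj[x] and not ((S - T) & adj[x])
                           for x in V - S):
                    return False
    return True

# The 5-cycle is 1-e.c.; the complete graph K4 is not
# (T = {} fails: every outside vertex is adjacent to the chosen vertex).
C5 = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
K4 = {i: {j for j in range(4) if j != i} for i in range(4)}
```

For block-intersection graphs of BIBDs the interest is in proving n-e.c. for families of designs, where such exhaustive checking is only a sanity test.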

Coast to Coast Seminar Series: Live from Vancouver, British Columbia "Computational challenges in prediction and design of nucleic acid structure"

University of British Columbia

Date: Nov 20, 2007
Time: 11:30 - 12:30
Room: ASB10901

Abstract

RNA molecules are increasingly in the spotlight, in recognition of the important roles they are now known to play in our cells and their promise in therapeutics. Function follows form in the molecular world, and so our ability to understand RNA function is enhanced by reliable means for predicting RNA structure. Outside of the cell, exotic DNA structures are now finding use in the construction of biosensors, nanotubes, lattices and much more, motivating the need for DNA structure prediction, as well as design of DNA sequences that fold to specific structures. This talk will provide an overview of successes and challenges of physically motivated algorithms for prediction and design of nucleic acid (RNA and DNA) secondary structure.
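The algorithms surveyed in the talk are not specified in the abstract; the simplest physically motivated baseline for secondary-structure prediction is base-pair maximization over non-crossing pairings (Nussinov-style dynamic programming), sketched here without the usual minimum-loop-length constraint:

```python
def nussinov(s):
    """Maximum number of non-crossing Watson-Crick base pairs in RNA
    sequence s, by O(n^3) dynamic programming over subintervals.
    (Real predictors minimize a thermodynamic energy model instead.)"""
    pair = {('A', 'U'), ('U', 'A'), ('G', 'C'), ('C', 'G')}
    n = len(s)
    dp = [[0] * n for _ in range(n)]
    for span in range(1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                     # base i left unpaired
            for k in range(i + 1, j + 1):           # or i pairs with some k
                if (s[i], s[k]) in pair:
                    left = dp[i + 1][k - 1] if k - 1 >= i + 1 else 0
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, left + right + 1)
            dp[i][j] = best
    return dp[0][n - 1] if n else 0
```

The restriction to non-crossing pairs is exactly what makes secondary structure tractable; pseudoknots, which cross, are one of the "challenges" the abstract alludes to.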

Coast to Coast Seminar Series: Live from Wolfville, Nova Scotia "Supervised Learning via Bayesian Computation"

Hugh Chipman, Acadia University

Date: Nov 06, 2007
Time: 11:30 - 12:30
Room: ASB10900

Abstract

The last quarter-century has seen an explosion of flexible models invented by statisticians and machine learners. Increasing computing power and advances in learning algorithms have made it possible to fit such sophisticated models to large and complex data sets. At the same time, there have been breakthroughs in computational methods for Bayesian statistics, notably Markov chain Monte Carlo methods. In this talk I'll outline some of the ways that Bayesian methods can be used to learn complicated models from data. Specific examples such as decision trees and ensemble models will be used to illustrate particular issues, including the extent to which statistical inference is possible with complex models, the role of prior information and its ability to regularize estimated models, and how a statistical approach can enrich what would otherwise just be "algorithms for learning from data".

Coast to Coast Seminar Series: Live from Vancouver, British Columbia "Computational Challenges in Sensorimotor Biology"

University of British Columbia

Date: Oct 23, 2007
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Humans and other animals utilize spectacularly sophisticated sensorimotor systems to perceive and interact with their dynamic environment. Computational models are essential for understanding how these complex systems actually work. I will describe recent work in my group on building mathematical and computational models of these systems, focusing on three topics: (1) Numerical simulation of complex three dimensional musculoskeletal systems with neural activation; (2) Reconstruction of subject specific models from medical images and other measurements; and (3) Modeling the human hand and its interaction with external objects.

Coast to Coast Seminar Series - Special Session: Live From Halifax, Nova Scotia "A Chinese Prouhet-Tarry-Escott Solution"

The Open University, UK

Date: Oct 12, 2007
Time: 11:30 - 12:30
Room: ASB10908

Abstract

Jens Kruse Andersen recently set the challenge of finding complete factorizations of consecutive integers with more than 500 decimal digits. I was able to set records with up to 10 consecutive factorizations by using solutions of the ideal Prouhet-Tarry-Escott (PTE) problem, which is equivalent to finding polynomials with integer roots that differ only by an integer. PTE solutions with degrees up to s = 10 were known by 1944, but the problem with s > 10 had received only one solution, found almost inadvertently in 1999. It seemed to me that the ideal PTE problem might benefit from use of the Chinese remainder theorem. I shall describe how a new solution was found at degree s = 12 by the discipline of splitting the problem into a smart part that can be handled by Pari-GP and a brute-force part that benefits from parsimonious Fortran programming.
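For readers unfamiliar with the problem: an ideal PTE solution of size n is a pair of integer multisets with equal k-th power sums for every k = 1 through n-1. The classical size-6 example below (a standard solution from the literature, not one of the talk's new results) can be checked directly:

```python
def equal_power_sums(a, b, k_max):
    """True if multisets a and b have equal k-th power sums
    for every k = 1..k_max."""
    return all(sum(x**k for x in a) == sum(x**k for x in b)
               for k in range(1, k_max + 1))

# A classical ideal PTE solution of size 6: equal power sums for k = 1..5.
# Equivalently, prod(x - a_i) - prod(x - b_i) is a nonzero constant,
# which is the "polynomials with integer roots differing by an integer"
# formulation used in the abstract.
A = [0, 5, 6, 16, 17, 22]
B = [1, 2, 10, 12, 20, 21]
```

The power sums necessarily disagree at k = 6; if they agreed there too, the two degree-6 polynomials would be identical and the solution trivial.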

Coast to Coast Seminar Series: Live from Halifax, Nova Scotia "Modelling self-organizing networks with a hidden metric"

Dalhousie University

Date: Oct 09, 2007
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Current models for complex networks mainly aim to reproduce a number of graph properties observed in real-world networks. On the other hand, experimental and heuristic treatments of real-life networks operate under the tacit assumption that the network is a visible manifestation of an underlying hidden reality. For example, it is commonly assumed that communities in a social network can be recognized as densely linked subgraphs, or that Web pages with many common neighbours contain related topics. Such assumptions imply that there is an a priori "community structure" or "relatedness measure" of the nodes, which is reflected by the link structure of the graph. A common method to represent "relatedness" of objects is by an embedding in a metric space, so that related objects are placed close together, and communities are represented by clusters of points. In this talk, I will discuss graph models where the nodes correspond to points in space, and the stochastic process forming the graph is influenced by the position of the nodes in the space. The work presented was done in collaboration with William Aiello, Anthony Bonato, Colin Cooper, and Pawel Pralat.
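The simplest possible example of a graph process influenced by node positions is a plain random geometric graph -- far simpler than the self-organizing models of the talk, and shown here (names mine) only to make the "hidden metric" idea concrete:

```python
import random

def random_geometric_graph(n, radius, seed=0):
    """Place n nodes uniformly in the unit square and link every pair
    closer than `radius`: the link structure is a direct reflection of
    the hidden metric positions, so clusters of points become densely
    linked subgraphs."""
    rng = random.Random(seed)
    pos = [(rng.random(), rng.random()) for _ in range(n)]
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)
             if (pos[i][0] - pos[j][0]) ** 2
              + (pos[i][1] - pos[j][1]) ** 2 < radius ** 2]
    return pos, edges

pos, edges = random_geometric_graph(50, 0.2)
```

The models of the talk make the dependence on position stochastic rather than deterministic, so that the metric is genuinely hidden and must be inferred from the links.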

Coast to Coast Seminar Series: Live from Burnaby, British Columbia "Spectra of (3,6)-Fullerenes"

Simon Fraser University

Date: Sep 25, 2007
Time: 11:30 - 12:30
Room: ASB10900

Abstract

A (3,6)-fullerene is a 3-regular planar graph whose faces are triangles and hexagons. Being variants of Buckyballs, these graphs are of interest to chemists. It was conjectured (P. Fowler, 1995) that the spectrum of any (3,6)-fullerene consists of eigenvalues occurring in opposite real pairs ±λ, together with four (unpaired) exceptional eigenvalues 3, −1, −1, −1. We prove this conjecture (and more) by expressing every (3,6)-fullerene as a Cayley sum graph, a variant of the Cayley graph which was introduced by Ben Green in 2003. This is joint work with Matt DeVos, Robert Samal, and Bojan Mohar.

Coast to Coast Seminar Series: Live from Fredericton, New Brunswick "Halfspace depth: motivation, computation, optimization"

University of New Brunswick

Date: Mar 27, 2007
Time: 11:30 - 12:30
Room: ASB10901

Abstract

The halfspace depth of a point p with respect to a set of points S is the minimum, over all closed halfspaces h whose boundary passes through p, of |h ∩ S|. In this talk I will discuss a motivating example from computational statistics due to Ivan Mizera, and briefly review a method of computing halfspace depth (due to myself, Komei Fukuda, and Vera Rosta) based on discretization of the space of all hyperplanes through p. I will finish by discussing faster, but more memory-intensive, methods based on branch-and-cut. Several authors have worked on similar methods, the latest being myself and Dan Chen. Compared with some of the earlier formulations, we take advantage of more of the geometry of the problem. Unfortunately the transition from integer program to mixed integer program introduces some numerical complications; in particular it introduces "arbitrarily large" and "arbitrarily small" constants into the formulation. Eliminating this numerical unpleasantness is an interesting open (as far as I know) problem.
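In the plane the definition can be evaluated by brute force, since a minimizing closed halfspace can be rotated until its boundary passes arbitrarily close to a data point. This is a hedged O(n^2) sketch of mine, far from the talk's optimized methods:

```python
import math

def halfspace_depth_2d(p, S, eps=1e-7):
    """Halfspace depth of point p w.r.t. point set S in the plane:
    the minimum over directions u of the number of points on the
    closed side {s : (s - p) . u >= 0}.  It suffices to try directions
    perturbed slightly off each normal at which some point crosses
    the boundary line."""
    def count(theta):
        ux, uy = math.cos(theta), math.sin(theta)
        return sum((sx - p[0]) * ux + (sy - p[1]) * uy >= 0 for sx, sy in S)
    angles = [math.atan2(sy - p[1], sx - p[0]) for sx, sy in S
              if (sx, sy) != p]
    candidates = [a + d for a in angles
                  for d in (math.pi / 2 - eps, math.pi / 2 + eps,
                            -math.pi / 2 - eps, -math.pi / 2 + eps)]
    return min(count(t) for t in candidates) if candidates else 0

square = [(1, 0), (0, 1), (-1, 0), (0, -1)]
```

The center of the square has depth 2 (any line through it leaves two points on each closed side), while a point outside the convex hull has depth 0; depth 0 exactly characterizes points outside the hull.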

Coast to Coast Seminar Series: Live From Burnaby, British Columbia "List Homomorphisms, Minimum Cost Homomorphisms, Interval and Circular Arc Graphs"

Simon Fraser University

Date: Mar 13, 2007
Time: 11:30 - 12:30
Room: ASB10900

Abstract

In this introductory talk, I will discuss a connection between structural characterizations of some natural and traditional graph classes (such as those mentioned in the title), and certain complexity questions arising in the study of graph homomorphisms, or more generally constraint satisfaction problems. In particular, I will relate dichotomy classifications of such homomorphism problems to structural characterizations of the corresponding graph classes.

Coast to Coast Seminar Series: Live from Clemson, South Carolina "Recounting the rationals"

Clemson University

Date: Feb 27, 2007
Time: 11:30 - 12:30
Room: ASB10900

Abstract

It is well known that the rationals are countable, that is, that there is a bijection from the non-negative integers to the rational numbers. As simple as the standard proof of this fact is, computationally it is remarkably mysterious: indeed, at the moment it is difficult to list, say, the 10^100th rational, and with current technology and algorithms impossible to give the 10^300th rational. In joint work with Herbert S. Wilf, we give an alternate enumeration of the positive rationals: after a detour via generating functions, restricted partitions and continuous nowhere-differentiable functions, we will discuss computational advantages of our enumeration. In particular, we will give the last few digits of the numerator and denominator of the 10^1000th rational.
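
Assuming this is the enumeration from the Calkin–Wilf paper "Recounting the rationals" (an assumption consistent with the joint work with Wilf mentioned above), it admits a strikingly simple successor rule, due to Moshe Newman, that visits every positive rational exactly once. A sketch:

```python
from fractions import Fraction

def calkin_wilf(n_terms):
    """First n_terms of the Calkin-Wilf enumeration of the positive
    rationals, via Newman's successor rule x -> 1/(2*floor(x) - x + 1),
    starting from 1.  Each positive rational appears exactly once."""
    x = Fraction(1)
    out = []
    for _ in range(n_terms):
        out.append(x)
        fl = x.numerator // x.denominator      # floor(x)
        x = 1 / (2 * fl - x + 1)
    return out
```

The sequence begins 1, 1/2, 2, 1/3, 3/2, 2/3, 3, ... ; the numerators and denominators are consecutive values of the hyperbinary counting function, which is where restricted partitions enter the story.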

Coast to Coast Seminar Series: Live From Vancouver, British Columbia "Advances in Human Computer Interactions"

University of British Columbia

Date: Feb 13, 2007
Time: 11:30 - 12:30
Room: ASB10900

Abstract

Communications and computing technology is advancing at an accelerated pace. Humans are finding it difficult to keep pace with these changes, and yet these new technologies are supposedly made for the benefit of humans. Dr. Fels will discuss some of the projects pursuing advances in Human Computer Interaction. Specifically, he will introduce projects on modeling of the human vocal tract, multi-camera systems for ubiquitous computing, and some work on art and technology.

Coast to Coast Seminar Series: Live from St. John's, Newfoundland "The Math Plague: Learning Strategies for Under-Achievers in Mathematics"

Memorial University

Date: Jan 30, 2007
Time: 11:30 - 12:30
Room: ASB10900

Abstract

This presentation is anecdotal but the knowledge on which it is based is not. As a researcher at Memorial University, Dr. Mantyka has had the opportunity to study the learning problems of under-achievers in mathematics collaboratively with psychologists, teachers, and students, to do statistical analysis on large data sets, and to conduct controlled experiments. However, under-achievers have no interest in sophisticated research; they simply want to be able to successfully complete their required courses in mathematics. This has led us to also develop bridging strategies as motivation to introduce these new learning strategies to students. The presentation will contain a sample of some common learning problems which have been identified, learning strategies to remedy these problems, and bridging strategies to motivate the student to use these new techniques. Many of the learning principles discussed in The Math Plague are applicable to any discipline and to any endeavor. Examples of this and justification for this statement will be provided.

Coast to Coast Seminar Series: Live from Lethbridge, Alberta "Symmetrically decomposable symmetric designs"

Department of Mathematics & Computer Sciences, University of Lethbridge

Date: Jan 16, 2007
Time: 11:30 - 12:30
Room: ASB10900

Abstract

A symmetric design is called "symmetrically decomposable" if its incidence matrix can be decomposed into blocks with each block being a zero matrix or an incidence matrix of a smaller symmetric design. The existence of such designs and a few related objects such as Balanced Generalized Weighing matrices and Bush-type Hadamard matrices will be discussed.

Coast to Coast Seminar Series: Modeling Deep Ocean Currents

Date: Nov 28, 2006
Time: 11:30 -
Room: ASB10901

Abstract

On the planetary scale, because of their enormous inertia and vast heat content as compared to the atmosphere, Earth's oceans act as the 'memory' and 'integrator' of past and evolving climate states. The ocean circulation comprises the surface-intensified wind-driven currents that act to transport warm equatorial waters toward the polar regions, and the abyssal, or deep, density-driven currents that transport cooler polar waters back toward the equator. Together, these two current systems describe the large-scale convective overturning of the oceans. The stability and time evolution of this global circulation pattern is critical in understanding climate variability and global change. Perhaps surprisingly, the present generation of the most sophisticated numerical ocean climate models does not adequately describe the observed spatial structure and temporal development of these abyssal currents. This overview talk, which will hopefully remain accessible to all, will describe efforts to develop a theory for the initiation, dynamical evolution and maintenance of abyssal ocean currents. Various themes in modern applied mathematics will be touched on, including physical mathematical modelling, asymptotic reduction, computational fluid dynamics, Hamiltonian partial differential equations and hydrodynamic stability theory.

Coast to Coast Seminar Series: Privacy Protection in Large Data Repositories

Head of the Department of Computer Science, University of Calgary

Date: Nov 21, 2006
Time: 11:15 -
Room: ASB10900

Abstract

The advent of large data warehouse systems and their associated data mining activities aimed at discovering 'new' information has substantially increased the threat to personal privacy. Data suppliers often provide their data for a specific purpose but there are no guarantees available to ensure that it will only be used for that purpose. Ultimately a goulash of legislative, enforcement, and technical solutions will be required to ensure that data is only used for its intended purpose. This talk will provide a snapshot of the current work in privacy protection in data repositories (both within the database management system and in the data miner), describe some ongoing research to capture more accurate data by providing privacy promises, and provide some initial insights into formalizing a privacy model. Ken Barker is a Professor of Computer Science at the University of Calgary with particular expertise in the area of database management systems. He is the Head of the Department of Computer Science. In addition to holding a Ph.D. in Computing Science from the University of Alberta (1990) he has nearly 25 years of experience working with industrial computer systems, fifteen years of consulting experience in the design of commercial databases, and a particular interest in system integration and distributed systems. As the director of a research laboratory at the Universities of Calgary and Manitoba, he has supervised over 50 graduate students and currently leads a lab consisting of 30 faculty, post-docs, graduate students, and research assistants. Dr. Barker has published in excess of 175 peer-reviewed publications in areas as diverse as distributed systems, software engineering, transaction systems, simulations and security. He has delivered over twenty industrial based courses on topics such as distributed database systems, data warehousing, system integration, Unix, and XML.

Coast to Coast Seminar Series: Live from Antigonish, Nova Scotia "Scalable Integer Factorization for Public Key Cryptosystems"

Department of Computer Science, St. Francis Xavier University

Date: Nov 07, 2006
Time: 11:15 -
Room: ASB10900

Abstract

Global data and communication networks are currently used not only as a way for scientists and researchers around the world to share ideas and information but also as an increasingly effective way for businesses, financial institutions and government organizations to communicate and engage in commercial activities. Communication and network security is therefore becoming an extremely important area of product research and development. The integer factorization and discrete logarithm problems are of practical importance because of the widespread use of public key cryptosystems whose security depends on the presumed difficulty of solving these problems. For example, there is no known deterministic or randomized polynomial time algorithm for finding a factor of a given composite integer. If fast integer factorization could be implemented, then the most popular algorithm of public key cryptography, the RSA algorithm, would be insecure. In this presentation, some of our most recent advances on solving integer factorization on high performance computer architectures will be reported and discussed.
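
The abstract does not specify which factorization methods are used; as a hedged, minimal illustration of why factoring underpins RSA security, here is Pollard's rho method, a classic small-factor finder (the talk's high-performance methods are far more sophisticated):

```python
from math import gcd

def pollard_rho(n, c=1):
    """Pollard's rho: find a nontrivial factor of a composite n by
    iterating x -> x*x + c (mod n) with Floyd's tortoise-and-hare cycle
    detection; a repeated value mod an unknown factor p shows up as a
    nontrivial gcd.  Assumes n is composite (it loops forever on primes)."""
    if n % 2 == 0:
        return 2
    x = y = 2
    d = 1
    while d == 1:
        x = (x * x + c) % n          # tortoise: one step
        y = (y * y + c) % n          # hare: two steps
        y = (y * y + c) % n
        d = gcd(abs(x - y), n)
    return d if d != n else pollard_rho(n, c + 1)   # retry with new constant
```

For instance, pollard_rho(91) returns the factor 7 after a single gcd. The expected running time is roughly n^(1/4) iterations, which is exactly why 1024-bit RSA moduli remain out of reach for such elementary methods.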

Coast to Coast Seminar Series "Notes from the Digital Trenches"

Dalhousie University

Date: Oct 10, 2006
Time: 12:15 -
Room: ASB10900

Abstract

For the past eight years the International Mathematical Union has had a Committee on Electronic Information and Communication (www.ceic.math.ca). This is perhaps the only such global academic committee. I have been chair or deputy chair of the CEIC for six years. In this talk I'll discuss and illustrate what we have learned and effected on issues like: intellectual-property rights; copyright; journal ownership and pricing; retro-digitization; federated searching.

Coast to Coast Seminar Series

Schrum Chair in Science, SFU

Date: Sep 26, 2006
Time: 11:15 -
Room: ASB10900

Abstract

A basic problem for studies of networks or graphs concerns how to obtain a sample of nodes and links from the network and use the values of those nodes and links to infer characteristics of the larger network of interest. For example, in HIV/AIDS studies of hidden, at-risk populations such as injecting drug users and commercial sex workers, often the most practical way to obtain a sample of respondents involves finding some initial members of the hidden population and following social links to add other members to the sample. In this talk I will describe a number of recent developments in sampling in networks together with approaches to inference from the sample data to the wider network. Dr. Steve Thompson's recent research has involved new adaptive designs for sampling populations that are elusive, rare, uneven, or hard to detect. Some of this work is focused on sampling in networks and some on sampling in spatial settings. Most recently Dr. Thompson worked on sampling designs in which the sampling units are in motion. A lot of his spatial sampling work has been motivated by problems in environmental studies, including ecological surveys of rare, clustered animal and plant species. The network sampling work has been motivated by problems in studies of hidden human populations such as those at high risk for HIV/AIDS, including injecting drug users and commercial sex workers. The designs in which the sampling units move were motivated by the problem of placing sensors to detect harmful microorganisms in the atmosphere.
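
The link-tracing idea described above can be sketched in a few lines. This is a toy illustration under my own assumptions (the function name and reseeding rule are mine, and real designs such as Dr. Thompson's also track inclusion probabilities for inference):

```python
import random

def link_trace_sample(adj, seeds, n, rng=None):
    """Toy link-tracing (snowball-style) sample of n distinct nodes:
    start from seed members of the hidden population and repeatedly
    follow a random social link out of the current sample, reseeding
    from an unsampled node if no links lead anywhere new."""
    rng = rng or random.Random(0)
    sampled = list(seeds)
    seen = set(seeds)
    while len(sampled) < n and len(seen) < len(adj):
        u = rng.choice(sampled)
        frontier = [v for v in adj[u] if v not in seen]
        if not frontier:                 # this node's links are exhausted
            frontier = [v for v in adj if v not in seen]
        v = rng.choice(frontier)
        seen.add(v)
        sampled.append(v)
    return sampled
```

The inferential challenge the talk addresses is precisely that a sample grown this way over-represents well-connected individuals, so naive averages over the sample are biased estimates for the whole network.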

Coast to Coast Conference on Mathematics of Computation

Simon Fraser University - University of Calgary - Dalhousie University

Date: Aug 05, 2006
Time: 08:00 -
Room: ASB10900

Abstract

Schedule: 9:00PT/10:00MT/1:00AT Anthony Bonato, Wilfrid Laurier University; 9:45PT/10:45MT/1:45AT Nils Bruin, Simon Fraser University; 10:30PT/11:30MT/2:30AT Robert Deupree, Saint Mary's University; 11:45PT/12:45MT/3:45AT Faramarz Samavati, University of Calgary; 12:30PT/1:30MT/4:30AT Herre Wiersma, Dalhousie University; 1:15PT/2:15MT/5:15AT Colin Percival, Simon Fraser University

Nils Bruin, Simon Fraser University. Title: Deciding the existence of rational points on curves. While it is known that Hilbert's 10th problem - deciding whether a polynomial equation has integral solutions - has no algorithmic solution, one can still hope that for subclasses of polynomial equations, and for rational solutions, such an algorithm might exist. Recently, experiments and theoretical work inspired by these experiments have provided some quite convincing evidence that for rational points on projective curves such an algorithm does indeed exist, and that we in fact already know the algorithm. I will outline this algorithm and indicate the heuristics suggesting it is correct.

Anthony Bonato, Wilfrid Laurier University. Title: Modeling the Infinite Web. The web graph is an important example of a self-organizing real-world network. Despite the computational challenges that emerge due to its massive size, researchers have discovered key properties underlying the evolution of this network. We now know that these properties - such as a power law degree distribution and small world topology - are ubiquitous in technological, biological and social networks. A large body of research on self-organizing networks now exists and spans many disciplines. Much recent research has focused on modeling self-organizing networks in an attempt to simulate and predict their observed properties. We will give an introduction to the web graph and related networks, and describe rigorous models for their evolution. Many researchers now view the web graph as infinite. We will describe some new research reconciling the theory of self-organizing networks with infinite graphs.

Robert Deupree, Director, Institute for Computational Astrophysics, Saint Mary's University. Title: Numerical Solution of Rotating Stars. The numerical solution of spherically symmetric stars was one of the first non-defense-oriented computational successes. The method used is a finite-difference technique, in which the star is divided into mass shells and differentials are replaced by differences between adjacent mass shells. Its success was so great that only now are we beginning to produce multidimensional models which can rigorously include rotation, magnetic fields, and instabilities capable of mixing material in various parts of the star. I will briefly review how we know these exist. It turns out that most of the important decisions must be taken before hand is placed on keyboard, and I will outline the logic that led me to the finite difference framework I have adopted.

Colin Percival, Simon Fraser University. Title: Rounding Errors in Complex Floating-Point Multiplication. While rounding errors in floating-point arithmetic are generally well understood, surprisingly little attention has been paid to rounding errors occurring in the course of performing floating-point multiplication of complex values. I will show that the trivial bound $\sqrt{8}\epsilon$ on the relative rounding error in complex floating-point multiplication can be replaced by $\sqrt{5}\epsilon$, and further demonstrate that this bound is effectively optimal by constructing the worst-case inputs in base-$2$ floating-point arithmetic systems. Finally, I will explain the important role which computation played in the discovery of these results.

Faramarz Samavati, University of Calgary. Title: Constructing Multiresolution from Subdivision. In computer graphics, both Subdivision and Multiresolution have been widely used. Subdivision is a method for constructing a high-quality object from a coarse approximation using some simple refinement rules. Multiresolution provides a hierarchical representation of objects at various levels of detail. In this talk, I present some methods for constructing multiresolution from a given subdivision. These methods are essentially based on two general approaches: reversing subdivision and constraining wavelets. Some practical examples are also presented.

Herre Wiersma, Dalhousie University. Title: Reviewing the Digital Library of Mathematical Functions. The massive project to construct the Digital Library of Mathematical Functions (DLMF), the online successor to the classic reference text by Abramowitz and Stegun, is nearing completion of its twin goals of producing an advanced public web site and a traditional book publication. To assess the usability of the web site, and to obtain feedback useful for improving it before its release to the general public, the project leaders arranged for us at DDrive to perform an in-depth critical evaluation of the DLMF. Adapting book content for the Web has been a challenging task. Presenting the DLMF in an interactive medium has entailed many difficult technical and
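
Percival's $\sqrt{5}\epsilon$ bound on the textbook complex product can be probed numerically. A hedged sketch (the helper name is mine; it computes the exact product with rationals and compares it against the double-precision result, with $\epsilon = 2^{-53}$):

```python
from fractions import Fraction

def complex_mul_error(a, b, c, d):
    """Relative rounding error of the textbook four-multiplication
    complex product (a+bi)(c+di) in double precision, measured exactly
    with rationals: |computed - exact| / |exact|.  Percival's result
    bounds this by sqrt(5)*eps, improving the easy sqrt(8)*eps bound
    (barring over/underflow, and assuming the exact product is nonzero)."""
    re = a * c - b * d                        # rounded real part
    im = a * d + b * c                        # rounded imaginary part
    A, B, C, D = map(Fraction, (a, b, c, d))  # exact binary values
    RE, IM = A * C - B * D, A * D + B * C     # exact product
    err2 = (Fraction(re) - RE) ** 2 + (Fraction(im) - IM) ** 2
    mag2 = RE ** 2 + IM ** 2
    return float(err2 / mag2) ** 0.5
```

Random trials stay under the bound, while Percival's worst-case inputs come within a vanishing fraction of it.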

Coast to Coast Seminar: Live From Halifax

President, MathResources Inc, Halifax, Nova Scotia

Date: Mar 28, 2006
Time: 11:30 -
Room: ASB10900

Abstract

There are a host of Authoring Tools, Content Management solutions and Object Repositories for content. Is there a system available better than all the others? When a system is chosen, how does this impact the Intellectual Property rights of the content author? As the teaching and learning process morphs from paper to digital content, what questions should be asked to best protect both the legal and moral rights of the authors? This is intended to be a conversation exploring individual experiences, posing questions and perhaps providing some insight into this rapidly expanding digital universe. MathResources Inc is a Halifax based company focusing on developing robust math software for teachers and students by uniting the expertise of a mathematician, a computer scientist and a publishing executive.

Coast to Coast Seminar: Live from IRMACS

Simon Fraser University

Date: Mar 14, 2006
Time: 11:30 -
Room: ASB10900

TBA

Coast to Coast Seminar: Live from University of Calgary

iCORE and IRC Chair, Department of Computer Science, University of Calgary

Date: Mar 07, 2006
Time: 11:30 -
Room: ASB10900

Abstract

Network traffic measurement is a technique used to collect low-layer packet-level information from an operational network. Analysis of such measurements can offer useful insights into how well a network is working, and how it is being used. Furthermore, by studying such networks over long periods, we can understand how network usage changes over time, and how we might design better networks for the future. While many fundamental characteristics of Internet traffic are now well understood, there are still occasional anomalies that baffle even the most experienced network researchers. In this talk, I will first provide a brief introduction to the tools and methods for network traffic measurement research, and summarize classic results for Internet traffic characterization. The rest of the talk will then describe some of our recent network measurement research activities, highlighting the challenges faced and identifying some of the anomalous events and emerging trends seen on today's Internet. Examples will include the U of C campus network, commercial networks, wireless networks, as well as P2P and media streaming applications. Dr. Carey Williamson is an iCORE (Informatics Circle of Research Excellence) Professor in the Department of Computer Science at the University of Calgary, specializing in Broadband Wireless Networks, Protocols, Applications, and Performance. He also holds an NSERC/iCORE/TELUS Mobility Industrial Research Chair in Wireless Internet Traffic Modeling. His educational background includes a B.Sc. (Honours) degree in Computer Science from the University of Saskatchewan in 1985, and a Ph.D. in Computer Science from Stanford University in 1992. Dr. Williamson's research interests include Internet protocols, wireless networks, network traffic measurement, workload characterization, network simulation, and Web server performance. He is a member of IEEE and ACM.

Coast to Coast Seminar: Live From Wolfville, Nova Scotia "L-Functions and Arithmetic"

Jeff Hooper

Date: Feb 28, 2006
Time: 11:30 -
Room: ASB10900

Abstract

Since its inception in the mid 19th century, the Riemann zeta function has taken a central role in number theory. On the one hand, it is a single object that encapsulates all the properties of prime factorization of integers, and hence all of arithmetic; on the other hand, it is by definition a complex function, and so tools from function theory can be brought to bear. Throughout the early 20th century, the study of this function branched out into generalizations to larger number fields, and these new zeta functions were found to have similar behavior. Further generalizations were made, both in terms of moving from number fields to more general objects from algebraic geometry, and by allowing group characters into the mix. Nowadays these various functions are all encapsulated under the general term 'L-function'. In this talk, we'll focus attention on one class of these, the so-called Artin L-functions. These functions involve the interplay of number fields and characters of the corresponding Galois groups. In the early-mid 1970s, startling and very deep connections were discovered between properties of Artin L-functions and purely algebraic properties of number fields. While some of these connections are now fairly well understood, the ideas have generated a host of speculation and conjecture, and even the most basic questions are still not all answered. This talk will be aimed at a general (mathematical) audience, and we will certainly not go into all of these ideas.

Coast to Coast Seminar: Live from UBC "Periodic Complexes and Group Actions"

University of British Columbia

Date: Feb 14, 2006
Time: 11:30 -
Room: ASB10900

Abstract

A classical problem in topology is that of characterizing those finite groups that act fixed-point freely on a sphere. In this talk, we will review these results and describe recent work towards extending this to a product of two spheres. The methods we use are a combination of techniques in topology and group theory.

Coast to Coast Seminar: Live from Calgary "Computational Biology of Plants"

Department of Computer Science, University of Calgary

Date: Feb 07, 2006
Time: 11:30 -
Room: ASB10900

Abstract

Plant modeling is an interdisciplinary combination of mathematical formalisms, biological knowledge, computer simulations, and visualization techniques. An important modeling method is based on the theory of Lindenmayer systems (L-systems). At present, L-system models make it possible to: accurately recreate the structure and development of plants based on experimental data; simulate plant physiology and the effects of manipulations or different external conditions on plant development; and simulate plants not only in isolation, but also in their ecological contexts. Current research problems include simulation-assisted studies of the molecular basis of plant development and form. The presentation will survey applications of L-system models across multiple levels of plant organization, from genes to individual plants to plant ecosystems.
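
At its core, a deterministic context-free (D0L) L-system is just parallel string rewriting. A minimal sketch, using Lindenmayer's original algae model rather than anything from the presenter's software:

```python
def lsystem(axiom, rules, steps):
    """Apply D0L rewriting rules to every symbol in parallel, `steps`
    times; symbols without a rule are copied unchanged."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's algae model: A -> AB, B -> A.
# Successive strings: A, AB, ABA, ABAAB, ABAABABA, ...
algae = {"A": "AB", "B": "A"}
```

The string lengths grow as Fibonacci numbers, a first hint of how simple local rules generate the global branching structures used in plant models (in graphical L-systems the symbols are interpreted as turtle-drawing commands).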

Coast to Coast Seminar: Live from Halifax "Teaching OR/MS Using Integrated Computing Systems"

Pinter Consulting Services & Dalhousie University

Date: Jan 31, 2006
Time: 11:30 -
Room: ASB10900

Abstract

Dr. Pinter is the author and editor of several books and numerous other publications in operations research and its applications. He is the winner of the 2000 INFORMS Computing Society Prize, and was Global Optimization Vice-Chair of the INFORMS Optimization Section (2002-2004). He is an editorial board member of the Journal of Global Optimization; the Journal of Applied Mathematics & Decision Sciences; Algorithmic Operations Research; the Int. J. of Modeling, Identification and Control; and of the websites GAMS Global World and GAMS Performance World. He is the principal developer of the LGO, AIMMS/LGO, Excel PSP/LGO, GAMS/LGO, Maple Global Optimization Toolbox, MathOptimizer, MathOptimizer Professional, MPL/LGO, and TOMLAB/LGO software products for nonlinear (global and local) optimization. First, a concise account of the 2005 INFORMS Teaching Management Science Workshop is provided, to set some pertinent general objectives. This is followed by more specific notes and examples related to using integrated computing systems to teach nonlinear modeling and optimization. This talk is partially based on a forthcoming paper with Dr. Ignacio Castillo (Wilfrid Laurier University) and Dr. Tom Lee (Maplesoft).

Coast to Coast Seminar: Live From Edmonton "Solving Checkers"

University of Alberta

Date: Jan 17, 2006
Time: 11:30 -
Room: ASB10900

Abstract

Dr. Schaeffer led the team that wrote Chinook, the world's strongest American checkers player. He is currently involved in the University of Alberta GAMES group developing computer poker systems. The most famous of these is the Poki poker player, which uses Monte Carlo simulation to both simulate and model human opponents. Dr. Schaeffer's group has also developed a computer player called PsOpt (for Pseudo-Optimal) which uses a mixed-strategy Nash equilibrium solution to a reduced form of Texas hold'em poker. Dr. Schaeffer is the author of 5 books. Artificial intelligence has had notable success in building high-performance game-playing programs to compete against the best human players. However, the availability of fast and plentiful machines with large memories and disks creates the possibility of 'solving' a game. This has been done before for simple or relatively small games. In this talk, we discuss solving the game of checkers. Checkers is a popular game of skill with a search space of 10^20 possible positions. This talk reports on our first result. One of the most challenging checkers openings has been solved -- the White Doctor opening is a proven draw (assuming neither side makes a mistake). Solving roughly 50 more openings will result in the game-theoretic value of checkers being determined.
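
'Solving' a game means computing its game-theoretic value under perfect play. On a toy scale the idea looks like this: a memoized negamax search over a simple take-away game, standing in for Chinook's vastly larger computation (this is a generic illustration, not Chinook's actual retrograde-analysis machinery):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def solve_takeaway(heap):
    """Game-theoretic value of a toy take-away game (remove 1-3 stones
    from a heap; a player with no move loses): +1 means the player to
    move wins under perfect play, -1 means they lose.  'Solving'
    checkers means computing this kind of value over ~10^20 positions."""
    if heap == 0:
        return -1                    # no stones left: player to move loses
    return max(-solve_takeaway(heap - k) for k in (1, 2, 3) if k <= heap)
```

Here heaps whose size is a multiple of 4 are proven losses for the player to move, exactly analogous to the White Doctor opening being a proven draw.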

Coast to Coast Seminar - LIVE FROM HALIFAX

Faculty of Computer Science, Dalhousie University

Date: Dec 06, 2005
Time: 11:30 -
Room: ASB10900

Abstract

Inferring phylogenetic relationships between sequences is a difficult and interesting problem. Assuming that there is enough phylogenetic signal in a set of biological sequences to resolve every tree bifurcation, the resulting tree is a representation of the vertical descent history of a gene. As the number of sequences and/or their length grows, determination of good phylogenetic trees becomes extremely computationally intensive. Jointly with biochemists and biologists, we at the cgmlab (www.cgmLab.org) have been exploring the application of distributed memory clusters to such problems. This talk will describe: a parallel version of the standard sequential multiple sequence alignment tool CLUSTALW; parallel covSEARCH, an algorithm for protein phylogeny using a maximum likelihood framework; and a fixed parameter tractability (FPT) algorithm for identifying 'conflicted sequences' whose removal often drastically improves the quality of the multiple sequence alignment and associated phylogenetic trees. The talk will be a somewhat messy mix of algorithms, biology, and parallel performance evaluation. Joint work with C. Blouin, D. Butt, J. Cheetham, F. Dehne, G. Hickey, U. Stege, P. Taillon

Coast to Coast Seminar "The Inverse Protein Folding Problem"

School of Computing Science, Simon Fraser University

Date: Nov 22, 2005
Time: 11:30 -
Room: ASB10900

Abstract

Inverse Protein Folding (IPF) has the potential to significantly impact future drug design by providing computational tools that aid in the development of novel proteins with specific structural properties. In its most primitive state, IPF is a method of determining an amino acid sequence which takes on a prescribed structure within a specified (natural) environment. Dill proposed the hydrophobic-polar model to study this problem, since hydrophobic interactions account for the vast majority of folding forces in the protein. While the problem is NP-complete under this model, even in the 2-D case, we show that it is possible to closely approximate the fold in 2-D, and investigate the problem for 3-D. In particular we study the structure of proteins to deduce those lattices which are most amenable to protein folds. Joint work with: C. Mead, J. Manuch, X. Huang, B. Bhattacharyya, L. Stacho
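
In Dill's HP model, the quality of a lattice fold is measured simply by the number of hydrophobic contacts. A small sketch of the energy function on the 2-D square lattice (the function name and conventions are mine, for illustration; the talk's approximation results concern optimizing over folds, which this does not attempt):

```python
def hp_energy(seq, coords):
    """Energy of a 2-D square-lattice fold in Dill's HP model: -1 for
    each pair of hydrophobic (H) residues that occupy adjacent lattice
    sites but are not consecutive in the chain.  seq is a string over
    {H, P}; coords[i] is the lattice position of residue i."""
    pos = {c: i for i, c in enumerate(coords)}
    energy = 0
    for i, (x, y) in enumerate(coords):
        if seq[i] != "H":
            continue
        for nb in ((x + 1, y), (x, y + 1)):   # count each contact once
            j = pos.get(nb)
            if j is not None and seq[j] == "H" and abs(i - j) > 1:
                energy -= 1
    return energy
```

Minimizing this energy over all self-avoiding placements of the chain is the (NP-complete) folding problem; IPF asks the reverse question of choosing seq so that a prescribed fold is the minimizer.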

Coast to Coast Seminar Live from Halifax "Fermat Numbers, Wieferich and Wilson Primes: Computations and Generalizations"

Dalhousie University

Date: Nov 08, 2005
Time: 11:30 -
Room: ASB10900

Abstract

In this survey of recent and ongoing computational and theoretical work I report on the following interrelated topics, all dealing with very large integers: Fermat numbers, generalized Fermat numbers, the search for Wieferich and Wilson primes, and Fermat and Wilson quotients for composite moduli. A number of related topics will also be briefly discussed.
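
The defining congruences behind these searches are easy to state and test; a sketch (the searches reported in the talk push such tests to enormous ranges with far more efficient arithmetic):

```python
from math import factorial

def is_wieferich(p):
    """A prime p is a Wieferich prime if 2**(p-1) == 1 (mod p**2),
    i.e. the Fermat quotient (2**(p-1) - 1)/p vanishes mod p."""
    return pow(2, p - 1, p * p) == 1

def is_wilson(p):
    """A prime p is a Wilson prime if (p-1)! == -1 (mod p**2),
    i.e. the Wilson quotient ((p-1)! + 1)/p vanishes mod p."""
    return factorial(p - 1) % (p * p) == p * p - 1
```

The only known Wieferich primes are 1093 and 3511, and the only known Wilson primes are 5, 13 and 563, despite searches far beyond 10^9; heuristically each should occur with density about 1/p, which is why the hunts are so long.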

Coast to Coast Seminar "Semantic blueprints of discrete dynamic systems: challenges and needs in computational modeling of complex behaviour"

School of Computing Science, Simon Fraser University

Date: Oct 25, 2005
Time: 11:30 -
Room: ASB10900

Abstract

How can one cope with the notorious problem of establishing the correctness and completeness of abstract functional requirements in the design of control-intensive software systems prior to actually building a system? The answer given here explores abstract state machines (ASMs): a universal mathematical framework for semantic modeling of discrete dynamic systems. Combining common abstraction principles from mathematical and computational logic, ASMs provide a universal model of computation and an effective instrument for analysing and reasoning about complex semantic properties of real-world systems. ASMs have been studied extensively in industry and academia over the last 15 years. Widely recognized applications include semantic foundations of virtually all kinds of architectures, languages and protocols. In this talk, we focus on concurrent and reactive systems in automotive control, e-business, and advanced telecommunication services, and more recent work in computational criminology, safety and security.

Coast to Coast Seminar "Pyrite or gold? It takes more than a pick and a shovel"

Dalhousie University

Date: Oct 11, 2005
Time: 11:30 -
Room: ASB10900

Abstract

Data mining and machine learning techniques have been proposed as a mechanism for detecting malicious activity by examining a number of data streams representing computer and network activity. Although some encouraging results have been obtained, most systems do not deliver in the field what they promised in the lab. There are a number of reasons for this, but the most likely one is the failure of the developers of such systems to understand what the systems have learned and to relate it to the activity they are seeking to detect: put simply, there are too many serendipitous directions, or the distinguishing behaviour is insufficient to establish either necessary or sufficient conditions for maliciousness. This talk will explore a number of examples that help to illustrate the problem and is intended to serve as a cautionary tale for workers in the field. John McHugh is a professor and Canada Research Chair in Privacy and Security at Dalhousie University in Halifax, NS, where he also directs the Privacy and Security Laboratory. Before joining the faculty at Dalhousie, he was a senior member of the technical staff at the CERT Coordination Centre, part of the Software Engineering Institute at Carnegie Mellon University, where he did research in survivability, network security, and intrusion detection. He was also affiliated with CyLab and the Center for Wireless and Broadband Research, both part of the Department of Electrical and Computer Engineering at CMU. Prior to joining CERT, Dr. McHugh was a professor and chairman of the Computer Science Department at Portland State University in Portland, Oregon, where he held a Tektronix Professorship. He has been a member of the research faculty at the University of North Carolina and has taught at UNC and at Duke University. For a number of years, Dr. McHugh was a Vice President of Computational Logic, Inc., a contract research company formed to further the application of formal methods of software design and analysis in support of security and safety critical systems. While at CLI, he developed tools for the analysis of covert channels in multilevel secure systems and worked on the problems associated with the efficient implementation of formally specified systems. He has also worked for the Research Triangle Institute, the Naval Research Laboratory, the National Oceanic and Atmospheric Administration, the University of Minnesota, and the U.S. Patent Office. Dr. McHugh received his PhD degree in computer science from the University of Texas at Austin. He has an MS degree in computer science from the University of Maryland, and a BS degree in physics from Duke University. He is the author of numerous technical papers and reports. He has served as the chair of the IEEE Computer Society's Technical Committee on Security and Privacy and is a member of the advisory board for the International Journal of Information Security. He serves on the program or advisory committees of many of the major conferences and workshops in the computer security field.

Coast to Coast Seminar "The Riemann Hypothesis"

Mathematics

Date: Sep 27, 2005
Time: 11:30 -
Room: ASB10900

Abstract

The Riemann Hypothesis is arguably the most important unsolved problem in mathematics. The Clay Mathematics Institute has established seven one-million-dollar prizes for what it considers the most outstanding mathematical challenges for the new millennium, and one of them is for the Riemann Hypothesis. The nature of the problem and its effect on mathematics are the focus of this lecture.

Coast to Coast Seminar "Mathematical Visualization and Other Learning Tools"

Dalhousie University

Date: Sep 13, 2005
Time: 11:30 -
Room: ASB10900

Abstract

Current and expected advances in mathematical computation and visualization make it possible to display mathematics in many varied and flexible ways. I'll explore some of the present opportunities to integrate graphic and other tools into the curriculum --- for pedagogic and aesthetic reasons.