MapleSim 6

After an initial lockstep phase in which MapleSim upgrades were closely tied to those for Maple, release 6 was held back and appeared shortly after Maple 16. It may be that this marks the point at which the simulation package asserts its existence as an application in its own right, rather than as a younger sibling in the shadow of its older stablemate.

Not that the links with Maple have loosened in any functional sense; in fact, their potential for integration continues to grow. Maple can now transparently open MapleSim (and Modelica, of which more in a moment) models within its worksheets and make thread-safe MapleSim function calls from its high-level, multithreaded Task Programming Model. Information from the MapleSim model, including the system equations, can be accessed, as can MapleSim's full solver repertoire. And Maple can set or change MapleSim parameters and solver settings, so it is able to operate as a symbolic front end. The two products remain firmly complementary, but not co-dependent; MapleSim is developing a wider vision.
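To give a flavour of that symbolic front-end role, a Maple-side session might look something like the minimal sketch below. The model file name, parameter name and option values are illustrative assumptions rather than details drawn from a real model, and while the commands (LinkModel, GetEquations, SetParameters, Simulate) follow Maplesoft's documented interface for linked MapleSim models, the exact options vary between releases and should be checked against the help pages.

    with(MapleSim):

    # Link an existing MapleSim model into the current worksheet
    # ("DoublePendulum.msim" is a hypothetical file name)
    A := LinkModel('filename' = "DoublePendulum.msim"):

    # Pull the system equations generated from the model into Maple
    # for symbolic inspection or manipulation
    eqs := A:-GetEquations():

    # Adjust a model parameter from the Maple side ("L1" is illustrative)
    A:-SetParameters(["L1" = 2.0]):

    # Run a simulation, choosing stop time and solver from Maple
    res := A:-Simulate('tf' = 10.0, 'solver' = "RKF45"):

Used in this way, Maple supplies the symbolic front end while MapleSim does the numerical heavy lifting.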

Part of that wider vision is an expanded level of interaction with Modelica. MapleSim has a connector for Modelica's FMI (Functional Mock-up Interface), allowing models from MapleSim to be reused by other FMI-aware components, and an enhanced ability to use Modelica functions and libraries. Modelica files can often be imported, MapleSim files exported, and models saved directly in Modelica format with topology and presentation specifications intact. External code generation includes a Modelica option. With Modelica becoming increasingly dominant in the simulation and modelling field, MapleSim is firmly establishing its claim to a significant position in the evolving ecology.

Other interconnectivity extensions include connectors for B&R Automation Studio and VI-Grade's VI-CarRealTime. The latter already has a significant industrial result in the form of a force-feedback simulator for use in compressed automotive design and development cycles. Existing links to the likes of Matlab gain speed.

Returning from outreach to core function, there are, as you would expect, improvements and extensions across the board. The range of solvers has grown and their options have expanded, both of which extend flexibility, and symbolic pre-processing is faster.

There are several new components and expanded coverage for those already in place. Given my own area of work, the new mean functions in the mathematics range are prime favourites, but the n-port multiplex and demultiplex components for combining and separating signal channels have caused more hearts to flutter amongst colleagues who have taken my copy for a test drive. Multibody extensions simplify and streamline many tasks, and the new to/from signal blocks (conceptual port links without the need for explicit connection routes) are a delight whose value becomes increasingly obvious in dimensionally complex models.

A quick sample of other key developments shows improved control in most areas: heuristic event compression, reloadable process-interrogation snapshots, constraint stabilisation, hysteresis bandwidth, error tolerances, variable scaling, and so on.

There are also usability improvements to the user interface, visualisation and other areas; they are too numerous to mention and small in themselves, though their combined productivity benefit is considerable.
