All business execs must understand today's workplace trends

Future-ready revolution: 7 emerging workplace trends

 

A workforce transformation is taking place. There is a strong appetite for organisations and their employees to innovate with technology, to become more productive and to adopt the latest technology trends. Business decision makers must be aware of these developments and help their workforce to make the most of these advances in technology. Below, we discuss seven emerging trends that are impacting how your employees work.

1. Making mobile working the new normal

European workers tap into a range of devices to complete their tasks, from traditional desktops to tablets and smartphones. Mobile working is, in many ways, now the accepted way to work – and business leaders are investing in that approach. Researcher Forrester says 48% of employee-facing IT investments are mobile-focused, while analyst Gartner says 70% of firms treat providing mobile support to employees as a high priority.

2. Providing support for varied workstyles

Remote working provides great benefits, but not everyone wants to quit the office. More than half of German employees (53%) still do their best work at an office desk, and most British employees also complete their best work in the office. Technology can help employees, such as on-the-go professionals, to complete tasks from any location. But in-person interactions are also key to productive work, a fact noted by 88% of German employees.

3. Building smarter workspaces for all

If you are heading to an office, you want an enjoyable experience – yet few businesses are using technology to make smart workspaces a reality. As many as 47% of UK employees say their workspace is not smart enough and does not make use of key technology like the IoT. French employees, meanwhile, would rather have high-tech perks like IoT and VR than low-tech perks such as free food. Your business must think smarter to keep workers happy.

4. Separating work and home effectively

Technology allows employees to work remotely, but there is also a danger it can allow business concerns to eat into personal lives. Different workers have different strategies. As many as 80% of British employees keep their personal and work lives mostly separate. It is not easy for everyone, however – German employees say keeping work and personal lives separate (34%) is the biggest concern for remote workers. Your business must support a balanced approach to home and work.

5. Supporting the sharing economy

New models of working and living are on the rise. More than two thirds (69%) of French employees, including 83% of millennials, say they plan to participate in the sharing economy, where assets or services are shared between private individuals, typically via the internet. Three fifths (60%) of UK millennials also plan to buy or sell goods and services through the sharing economy. Business leaders must understand these new models and think about their potential impact on day-to-day operations.

6. Understanding how millennials think differently

More than half (52%) of French millennials say work is a key part of their social life, according to Dell and Intel's research. The same research shows that young Europeans are less keen to work in an office, are eager to embrace advanced technology and are more likely to quit a job because of poor IT. Business leaders must find ways to support the working demands of these talented newcomers to the European workplace.

7. Making the most of advanced technology

Gartner says global IT spending will reach $3.5 trillion in 2017, with the analyst saying firms are keen to invest in new software and services. Your employees are also eager to get their hands on new tools. Dell research suggests European workers are keen to make the most of advanced technologies, like IoT, VR and AR. More than two thirds (69%) of French employees, for example, say AI could make their jobs easier. Senior executives must find ways to support these developments for the benefit of the business.

Understanding key workplace trends will be central to your future business success. Explore the latest technology trends and insights to keep you ahead of the curve. Click here to find out more about building the workforce of tomorrow, today.

 


 


 

Mark Samuels


Mark Samuels is a business journalist specialising in IT leadership issues. Formerly editor at CIO Connect and features editor of Computing, he has written for various organisations, including the Economist Intelligence Unit, Guardian Government Computing and Times Higher Education. Mark is also a contributor to CloudPro, ZDNetUK, TechRepublic, ITPro, Computer Weekly, CBR, Financial Director, Accountancy Age, Educause, Inform and CIONET. Mark has extensive experience in writing on how CIOs use and adopt technology in business.


 

Tags: Business, Productivity, Workforce Transformation

5 security features of Windows Server 2016


 

With the release of Windows Server 2016, Microsoft has taken a serious look at several key aspects of the platform's security. Many of these improvements are not simply refreshes or updates to existing security measures; in many cases they are fundamental changes to Windows Server, designed to address the continual security challenges businesses face today.

Credential Guard – One of the challenges with Windows Server was that account security information was stored locally within the server. While it was protected and encrypted, it was still accessible, because the information resided in local server memory; it was possible for a rogue program to access that process memory and retrieve the credentials within it. Microsoft has now placed the security authority in its own virtual machine, running isolated from the host. This protects the credentials in memory in the event that the host machine suffers a security breach. Credential security has been a lingering problem for Microsoft, and using a micro-virtual machine to solve it is a truly inventive solution.

Windows Defender – One key improvement to mention is the continual enhancement of the Windows Defender application. While this is not a new product, it has undergone several updates that make it a more comprehensive security product for Microsoft. Enhanced real-time virus and malicious program scanning helps to put it on par with third-party offerings. While Windows Defender may not replace your current third-party packages today, its continual course of improvements, deep integration with the base OS and management abilities do warrant consideration as you plan your security refreshes.

Privileged Identity Manager – While a very technical product, it addresses the most non-technical security risk in Windows today: the administrator. For the Windows Server admin, having the highest level of security permissions was often necessary to do the daily admin job. However, with administrator-level credentials being stolen and used for data breaches, this needed to change. Privileged Identity Manager does two key things: it can limit the administrator's rights to just enough access to get the tasks done, and it can limit that access to the defined period of time needed to complete those tasks. This replaces the 24×7 "god-level" access most admins have with a balance between access to do the job and security.

Shielded VMs – While portability is one of the benefits of virtual machines, it is also one of the security risks. Cloning a virtual machine and removing it offsite poses a real risk to any business. With the new shielded VM feature in Hyper-V, virtual machines can now be encrypted at rest and will only power up if they are verified by the correct trusted platform module from the original secured host.

Device Guard – While anti-virus and malware protection creates a 'blacklist' of processes and prevents those unauthorized applications from running, Device Guard takes a slightly different approach. Device Guard creates a 'whitelist' of applications and binaries that can run in key parts of the OS. By only allowing certified binaries to run, it avoids the time lag between a new threat appearing and the blacklist being updated to block it.
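The whitelist-versus-blacklist distinction can be sketched in a few lines of code. This is an illustrative sketch only, not Device Guard's actual implementation; the hash values and function names here are hypothetical.

```python
# Illustrative sketch (not the Device Guard implementation): the difference
# between blacklist and whitelist enforcement when deciding whether a
# binary may run. All hashes below are hypothetical placeholders.

KNOWN_BAD = {"e99a18c428cb38d5f260853678922e03"}   # blacklist: known-bad binary hashes
TRUSTED = {
    "5f4dcc3b5aa765d61d8327deb882cf99",            # whitelist: certified binaries
    "098f6bcd4621d373cade4e832627b4f6",
}

def blacklist_allows(binary_hash: str) -> bool:
    # Blacklisting: anything not yet known to be bad is allowed,
    # so brand-new malware runs until the list catches up.
    return binary_hash not in KNOWN_BAD

def whitelist_allows(binary_hash: str) -> bool:
    # Whitelisting: only certified binaries run; an unknown
    # binary is blocked by default, with no detection lag.
    return binary_hash in TRUSTED

new_malware = "d41d8cd98f00b204e9800998ecf8427e"   # not on either list yet
print(blacklist_allows(new_malware))  # True  -> slips through the blacklist
print(whitelist_allows(new_malware))  # False -> blocked by the whitelist
```

The example shows why the whitelist approach closes the window between a new threat appearing and protection being updated: the default answer for an unknown binary flips from "allow" to "deny".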

With Windows Server 2016, Microsoft has enhanced and introduced several key security capabilities in its premier data center operating system.

 

Brian Kirsch


Brian Kirsch is an IT Architect and Instructor at Milwaukee Area Technical College, focusing primarily on virtualization and storage environments. He has been in information technology for over 20 years and has worked with VMware products for more than 10 years. Kirsch holds multiple certifications from VMware, CommVault, Microsoft and Dell EMC. He also sits on the VMUG Board of Directors, a role he has held for the last five years, helping to guide and shape the user community. Kirsch also provides a direct line of communication from customers to the VMware product architects through the VMware Customer Council and the VMware Inner Circle.


 

Tags: Software, Technology

Automation in practice: how software robots work


 

Software automation is not automatic software as such. The term describes the automatic elements inside wider software systems – applications, smaller software agents and broader architectures – that can be programmed to produce defined automatic responses and controls based on conditions within the total system's operational flow.

Technology analyst house IDC points out that we now stand at a point of increased use of specialised, increasingly automated management tools inside datacentre networks.

Deployed intelligently, this layer of technology can a) allow different parts of the software-defined datacentre technology stack to work well in unison and b) provide the necessary layers of separation between compute and storage parts in the total IT stack to ensure operational efficiency is maximised.

Stepping stones to automation

IDC views specialised tool adoption as a stepping stone toward end-to-end management and, finally, automation. The firm argues that as soon as enterprises see the efficiency gains these kinds of specialised tools provide, the focus will turn to end-to-end management and automation.

According to IDC, “This notion is also illustrated in the current levels of automation across general IT infrastructure and networking in particular, where the use of automation, particularly in DC networking, is higher than the use of automation in IT infrastructure overall.”

Where automation manifests itself

Software automation controls will form part of intelligent network architectures inside the modern software-defined datacentre. Where some automation will be tasked with responsibilities for core monitoring and reporting functions, more sophisticated automation will perform a basic level of remedial service where problems occur.

These ‘problems’ may not be mechanical issues as such; it is more likely that automation controls will be tasked with jobs such as reapportioning and rechannelling data to avoid bottlenecks when and where they occur.
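The kind of rechannelling control described above can be sketched as a simple monitor-and-remediate loop. This is an illustrative sketch under assumed conditions: the 80% utilisation threshold, the link names and the rebalancing policy are all hypothetical, not drawn from any particular product.

```python
# Minimal sketch of an automation control: watch link utilisation and
# rechannel traffic away from a congested link when a bottleneck appears.

THRESHOLD = 0.8  # hypothetical policy: act when a link exceeds 80% utilisation

def rebalance(links: dict) -> dict:
    """Shift the excess load on any congested link onto the least-loaded link."""
    # Snapshot the current readings, most-loaded first, then remediate.
    for name, load in sorted(links.items(), key=lambda kv: -kv[1]):
        if load > THRESHOLD:
            spare = min(links, key=links.get)   # least-loaded link right now
            shift = load - THRESHOLD            # excess traffic to move
            links[name] -= shift
            links[spare] += shift
    return links

# One pass of the control loop over three hypothetical uplinks.
links = {"uplink-a": 0.95, "uplink-b": 0.40, "uplink-c": 0.30}
print(rebalance(links))
```

A real control would run continuously against live telemetry and log each remediation; the point here is only the shape of the loop: monitor, detect a threshold breach, apply a corrective action.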

Automation as a ‘key five’ factor

In the contemporary software-defined datacentre, automation lines up alongside four other key paradigm-shifting trends. Together, these are the five key factors reshaping our networks:

  1. Automation
  2. Software-defined networking (SDN)
  3. Network virtualisation (including NFV)
  4. Open networking standards
  5. End-to-end infrastructure management

We can reasonably expect enterprises to adopt SDN, open networking, network function virtualisation and the associated trends of automation and end-to-end infrastructure management quickly, once certain conditions are met both a) in individual use cases and b) across the market as a whole.

A ‘cautious increase’ for automation

According to IDC, a dominant view on the future of automation is a cautious increase. In a recent survey the analyst house found that for general IT, 56% of respondents indicated that they plan to adopt a bit more automation and 3% said they plan to automate a lot more.

So are automation controls actually robots? Well, yes: in a sort of virtualised, abstracted, non-physical, software-defined world, they actually are. An easier question is whether automation controls are an increasing part of our future networks – and the answer is yes.

 

Adrian Bridgwater


Adrian is a technology journalist with over two decades of press experience. Primarily he has worked as a news analysis writer dedicated to a software application development 'beat', but in a fluid media world he is also an analyst, technology evangelist and content consultant. He has spent much of the last ten years focusing on open source, data analytics and intelligence, cloud computing, mobile devices and data management.


 

Tags: Software, Technology

It’s not the size of your data, it’s what you do with it


Blackboard is a leading education technology company with 100 million users and 16,000 customers using its solutions across 90 countries.  They’ve seen the adoption of mobile for learning becoming commonplace and have helped drive the debate on BYOD and big data in universities.

Higher education is just one of the many sectors that has heard talk of big data and wants to move ahead quickly to make the most of it. However, ‘doing’ big data is not what it’s about. If big data can be defined as more than one terabyte of data stored over many machines or billions to trillions of records of millions of people—all from different sources, is it even fair to call what educational institutes are attempting ‘big data’?  No university can boast that much data but that doesn’t mean they can’t use analytics for unprecedented institutional and academic improvement.  The key is how the data they have is interpreted, analysed, and how those insights are applied in strategic and operational decisions.

Big data is, by definition, complex. And, when it's multi-sourced, it's challenging to detect patterns that are meaningful. All universities monitor student activities, reporting the number of admissions, the applications accepted and the number of students who achieve academic qualifications. Yet the annual reviews required by most governments are ineffective when it comes to developing a good understanding of the bigger and smaller picture: the big picture of how successful a particular course is across all students, and the small picture of how an individual is coping day to day.

Evaluations performed at the end of the semester, module or course can highlight areas of concern, but by then it may be too late to provide remedial support to individuals who have struggled to progress. Fortunately, there are universities that have realised the data at their fingertips can provide a real-time view of students' performance and likelihood of success, and that they don't need to wait for official assessments to learn about teaching or learning issues.

The majority of teachers and professors rely on their own interaction with students to monitor their progress and recognise if a student is falling behind. But, with the growing diversity of students and increasing class sizes, they will struggle to stay up to speed on each individual's success rate and react in time with steps to help. Technology can assist here: Learning Management Systems can provide an alert if assignments are late or attendance drops. However, this doesn't offer the insight that could be achieved if the student data was analysed more discerningly.

The idea of initiating an analytics project can be daunting, but it is possible to start in a smaller yet effective way. First and foremost it is important to have clarity on what, how and why the data is being monitored. Student success, for example, is the goal of many analytics projects. Retention, progress, and academic achievement all determine success and universities need to know how success is defined for them, what are the data points, and how they will collect and analyse data.

It is worth remembering that, regardless of how many data points a university holds about a student, it initially needs to identify only those factors that have the most significant influence on student success.

Once the influencing factors have been decided, the next most important aspect is to have the data points for these available for the required period, without gaps.  For example, if student attendance in class is an influencing factor, then the data should be stored and be accessible for every occurrence of the class.  Or, if assessment grades are an influencing factor, then these grades should be stored and accessible for every assessment.  In that way, from the first lesson and the first assignment to the last, the teacher and faculty will know if a student is on track to succeed.  It’s also a way of highlighting any issues early enough to address them.
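The early-warning idea above can be sketched in a few lines: track a small number of influencing factors for every occurrence, and flag a student as soon as either factor slips. This is a hedged illustration only; the thresholds, factor names and records are hypothetical, not taken from any university's model.

```python
# Illustrative early-warning sketch: two hypothetical influencing factors
# (class attendance and assessment grades), each recorded for every
# occurrence, checked from the first lesson and first assignment onward.

ATTENDANCE_FLOOR = 0.75   # hypothetical: flag below 75% attendance
GRADE_FLOOR = 40          # hypothetical: flag when the average grade dips below 40

def at_risk(attendance: list, grades: list) -> bool:
    """Return True if either influencing factor suggests the student is off track.

    attendance -- one True/False entry per class occurrence
    grades     -- one numeric grade per assessment
    """
    attended = sum(attendance) / len(attendance) if attendance else 1.0
    average = sum(grades) / len(grades) if grades else 100.0
    return attended < ATTENDANCE_FLOOR or average < GRADE_FLOOR

print(at_risk([True, True, False, True], [62, 55]))    # attending and passing -> False
print(at_risk([True, False, False, False], [62, 55]))  # attendance slipping   -> True
```

Because every class occurrence and every assessment feeds the same two lists, the flag can fire mid-semester, early enough for the teacher to intervene, rather than at the end-of-course review.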

Data analytics is a science but big data doesn’t have to be ‘big’ to benefit universities and their student body.  The key is to identify the relatively few influencing factors that are useful indicators of student progress and make use of that information to support their learning. So far, many universities who have tried data analytics are still working project by project and only a minority has initiated a plan that can deliver the scale, scope or frequency to make the results of any report meaningful.  But we see that changing.

The objective is that data analytics will become the backbone of university administration and learning development.  If universities can positively affect student retention and qualification success, this will help them attract good teaching staff, future students and research project sponsors.  It’s more than worth the investment – even if the big data projects have to start quite small.

 

Dr Demetra Katsifli


Dr Demetra Katsifli is Senior Director of Industry Management at Blackboard. Her professional career in higher education spans 33 years and has focused on leading enterprise services and systems that underpin the core business of education. Demetra's research interest is in educational technology in higher education.


 

Tags: Big Data, Education, Industries

Let the robots worry about the work while humans handle the data

Service, not robots, is central to the re-shaping of the manufacturing industry


Apple supplier Foxconn last month revealed it has replaced 60,000 factory workers in one of its plants near Shanghai in China with robots. While the manufacturing sector is no stranger to robotics and automation replacing humans (you only have to look back to Fiat's 1979 "Handbuilt by Robots" advertising to realise that), Foxconn's move is definitely dramatic.

The sheer scale of change in just one factory will have sent reverberations across the globe. Certainly the sector, like many sectors, is going through a rapid adoption of new technology as well as coping with increased globalisation, but the biggest impact is not necessarily from robotics. Rather it is in how organisations are re-shaping and re-focussing on data.

The Internet of Things (IoT) is essentially triggering this change. The data it yields – collected from machinery, products and factories, and fed back into analytics software to be reviewed by specialists – is forming the backbone of manufacturer decision making. It's a digital revolution that is enabling more informed design, processes and innovation, but only if the data has context.

In a manufacturing context, digital goes through the whole piece, from the design concept, to actually modelling the manufacturing process before you’ve even cut metal. And then through to manufacturing environments and collecting performance data, which is all about understanding manufacturing capacity.

What is often overlooked is that digital is in fact driving which products and services manufacturers actually deploy. Servitisation is increasingly central, and a lot of companies are "extracting a lot of value out of delivering a service rather than a product", but service should now be integrated within a business and not be viewed as a bolt-on or a nice-to-have. It should provide the intelligence which shapes and determines the future of manufacturing.

And this is the point. Technology is enabling manufacturers to differentiate through service and even make money from it. No longer a cost centre but a profit centre, with the ability to upsell as well as feed back vital customer information, service is now increasingly the vehicle for customer and product intelligence. It puts data into context.

Of course the role of the field service tech is changing too. It has to. As the importance of service grows, so the field service techs have to evolve with it, and develop new skills. That means working with more functionality on their tablets, phones and even their clothes, as well as handling data collection, reporting and keeping customers happy.

Interestingly, according to the 2016 KPMG Global Manufacturing Output report, 49% of global manufacturing executives plan to significantly change the range of services they offer in the next two years, while 45% are concerned about the relevance of products and services they offer. With 44% concerned about customer loyalty, it’s easy to see how service techs can play an increasingly essential role within the manufacturing sector. Data, and the intelligence surrounding that data, are the lifeblood for innovation and growth.

 

Joe Kenny


Joe Kenny is Senior Director of Global Customer Transformation/Customer Success for field service management leader, ServiceMax.


 

Tags: Data Center, Software, Technology

Breaking the fourth wall of product design


If you’re a sci-fi aficionado, you’re probably familiar with the work of William Gibson. He’s been predicting the rise of virtual reality through his novels since the late 1980s. When he first experienced the new technology in 2015, the 67-year-old author simply proclaimed, “they did it”. As entry-level virtual reality becomes more affordable and accessible, Geoff Turner of product lifecycle management (PLM) provider, Design Rule, explains how this technology can be used to enhance product design.

There has been a barrage of media coverage proclaiming 2017 as the breakthrough year for virtual reality, and this is partly due to the dramatic reduction in the cost for the technology. In fact, statistics by Deloitte suggest that the Virtual Reality (VR) market is expected to break the £710 million barrier for the first time this year. VR holds enormous potential for many industries, from architecture and manufacturing through to marketing and public relations. The technology has dozens of possible applications and, so far, we’ve only scratched the surface of what it could achieve.

In the product design realm, computer-aided design and other digital technologies have been an essential part of a product designer's toolkit for a long time. However, designers are only just starting to strap on virtual reality headsets and embrace the technology.

Larger organisations have invested heavily in power walls and CAVEs for VR, but applications, whilst valuable, have focused on customer demonstration, digital mock-up and training. Whilst costs are falling, there have also been limitations on the hardware: tethered headsets can be restrictive, and a lot of applications still require dedicated graphics cards. The next generation of wireless headsets, more efficient applications and integration with existing systems, e.g. Product Lifecycle Management, looks set to change all that.

Beyond basic design

For product design, the applications for VR can begin before a designer’s pen hits the paper. If you have visited any kind of trade show or exhibition in the past twelve months, you are bound to have seen an increase in the use of VR headsets on exhibition stands. Naturally, VR is a great fit for trade stand engagement, ensuring a lasting impression will be made on visitors to the stand. However, VR at trade shows can also provide the perfect opportunity to collect customer feedback.

Creating an immersive experience for the customer offers the chance to showcase an intimate experience of a product in use, away from the salesroom and with near real-life quality. Here, customers can suggest improvements to the product, changes they would like to see made and pinpoint potential issues with the ergonomics of the product. What’s more, with an experienced CAD designer on the stand, these changes could be made in real-time, to showcase the importance of user experience to the customer.

Benefits of virtual reality for design and industrialisation

For designers, one of the most notable advantages of VR is the ability to visualise a product at numerous stages of the design and manufacturing process. Using this method, designers and production engineers can pinpoint potential problems and make the relevant design changes before any physical prototypes have been commissioned. In recent years, rapid prototyping technology has enabled designers to create highly accurate early prototypes. Using detailed design data drawn from the PLM platform, combined with advanced CPUs and graphics cards, virtual reality prototypes can now look and feel exactly as the product will for the customer – rather than simply producing a good copy.

Naturally, this level of product simulation can save time for designers, but for business owners the ability to reduce the need for multiple design and manufacturing changes can reap significant financial benefits further along the production process. This method has been further improved by techniques such as 3D printing and additive manufacturing. However, VR takes this digital product simulation to the next level.

A blurring of the edges

Whilst VR is gaining traction in design and at the desk, Augmented Reality (AR), a composite view of the real and digital worlds, offers a host of possibilities to companies. It is used to simulate a product in a real-world scenario, prior to its even being built, using high-definition rendered images or video to generate customer interest and demand. Maintenance applications use it to guide an engineer through a procedure and collect documentary evidence. Smart glasses can be used to relay information to the operative, while the camera recognises features that it sees. It can also be used to compare physical prototypes with digital mock-ups, or as part of a first article inspection to find discrepancies.

Designers can use augmented reality to assess ergonomics, including layout and the placement of controls, in an environment where the digital design data can be displayed to show how it will look. During this time, the production engineer can validate the assembly stages of the product. The manufacturability of the product can be evaluated, and ergonomic concerns with the assembly process can be assessed in a virtual factory environment. Design and production engineers can then evaluate each configuration and settle on final options, as the approach allows for the rapid development of virtual prototypes for customer design review.

So what’s next?

As application capability has broadened, augmented reality has become part of a medium called mixed reality. The term has been around for a while, but it has now become more mainstream with the ability to stream digital and physical reality in a hybrid environment.

Enter holographic computing, billed as the next major technology disruption after the smartphone, where glasses, like the Microsoft HoloLens, will replace today’s flat, 2D screens. Whilst the use cases are still being defined for industrial consumption, the potential seems to be considerable. With the ability to fully manipulate a digital model and enhance it with detailed design data drawn from the PLM platform, engineers can see the design intent and specification as they view an article, either digital or real.

When the initial products are produced, as part of a review or first article inspection, the engineer can receive information such as instruction, intended measurements, material and finish, through the headset whilst interacting with the real-world object. This provides a much richer dataset than traditional first article inspection. Results can be output in report form and design changes can be tested in the mixed reality environment, prior to the commitment to new or reworked parts.

As a concept, VR has been explored and fantasised about for decades. In fact, early versions of virtual reality technology have been around for more than 20 years. It's only in recent years that we have seen the technology really take centre stage. When author William Gibson exclaimed "they did it" upon experiencing 2015's tech, it wasn't the first time he had tried any kind of virtual reality. However, it was the first time that virtual reality had truly fulfilled the fantasy he had written about years beforehand.

As VR technology continues to advance at a rapid pace, as well as becoming more affordable and accessible, it is likely that technology is going to take a more prominent position across all industry sectors – including product design and manufacturing.

 

Geoff Turner


Geoff Turner is a product lifecycle management (PLM) and technical consultant at Design Rule. Design Rule is a value-added reseller of Dassault Systèmes design software, supplying the likes of CATIA, ENOVIA and 3DEXPERIENCE to a variety of industries. The company has a wealth of experience in design software for the aerospace, defence and engineering industries and has more recently introduced software for consumer goods and retail markets.


 

Tags: CAD, graphics & design, Technology

Workstations for designers: going beyond the PC

5 awesome reasons why design pros need a smart workstation

 

PCs and dedicated design work do not make good bedfellows. Here are 5 reasons why a dedicated workstation is a much better fit.

1. Capability to handle large files

A basic requirement – but one that a PC can struggle with. Files used by designers are not small, and cannot be made smaller without losing image quality. It is imperative that the chosen workstation can load and work with very large files.

A dedicated workstation can have large amounts of working memory in place alongside multiple CPUs, and may also accommodate PCIe/M.2 solid state storage providing far faster file loading and handling capabilities than a PC.

2. High definition, fast graphics

Whereas most PC users are happy with screen definitions of 72/96dpi, workstation 4k/5k screens can operate at up to 250dpi. This enables working at the levels of detail demanded for high definition print and broadcast publishing. Editing images can be done at full print level: even zooming in on parts of an image will not result in over-pixelisation of the content.

3D graphics need fast memory to deal with large files, alongside suitable graphics cards and monitors to provide smooth movement. It may be tempting to consider high-end gaming PCs as options; however, these are tuned to the different demands of rendering game content. There remains a gulf between a games-focused graphics card and a design-focused one.

3. Not just touch-enabled

Touch-enabled screens are becoming ubiquitous. People are used to them on tablets and smartphones, and are increasingly using them on laptops. PCs and workstations, however, have not been at the forefront of touch-enablement – but this is changing.

Multitouch-enabled screens let designers pinch and pull work to zoom in and out, use a pressure-sensitive stylus to draw and edit, and mark up copy within a composite directly on screen.

New capabilities are also coming through – dedicated ‘pucks’ or ‘dials’ are adding greater flexibility and functionality to how designers can interact with workstations without the need to reach for a keyboard or mouse.

4. Multiple screens

Designers work against composites that consist of text and images. There is a need to edit both the graphics and the text while avoiding ‘alt-tabbing’ between windows to do this.

Support for multiple screens means that not only can several items be worked on and seen at the same time, but that the impact of a change can be observed directly in the composite as changes are made.

However, 4k/5k screens and touch-enablement do not always work well together. Some newer workstation designs combine a 4k/5k screen with a lower-definition digital work surface used as an advanced touch-enabled environment, allowing for content control while still viewing the result in high definition.

5. Tuned to your applications

PCs are not really tuned – they have to be general workhorses that can run anything that is installed on them as best they can.

A workstation can be tuned to run a specific application – whether that is Adobe Creative Suite, Autodesk, PTC or another suite. As such, you will get the best performance that can be squeezed out of the application – at the CPU, storage, memory and interconnect levels.

 

Clive Longbottom

Clive Longbottom has been an industry analyst for around 25 years. Having trained as a chemical engineer, he sees everything in terms of process. Clive has worked in several positions across medium and large sized organisations, giving him a strong understanding of what an organisation needs – which isn’t an unquenchable thirst for technology. Clive looks at everything from the point of “What could this do for me, my team, my department, my organisation, the value chain?”, which enables him to weed out the very important, often quite basic, technologies from the more elegant, yet less useful ones.

 

Rugged devices – the key to productivity for many roles

 

Ruggedised devices combine hardened dust- and water-proofed cases, toughened screens and dedicated input tools, often with dedicated applications for a specific task. A basic example is the handheld used by postal delivery agents, which requires the recipient to sign for a delivery on an electronic device. However, such dedicated small devices do not meet the needs of all workers – it is often the case that a larger rugged tablet or laptop capable of running multiple apps is required.

Here are five jobs where a larger rugged device makes sense.

1. Field engineers

Engineers sent out to fix, for example, utility systems need not only simple job-ticketing systems, but also access to instruction manuals, blueprints and ‘how to’ videos. Standard laptops and tablets do not have the right attributes to cope with the rigours of this work, whereas ruggedised versions can take the knocks, dust and damp that it entails.

2. Construction workers

Those working on construction sites need systems that can provide access to plans and take inputs – but can also survive being dropped, being run over by construction vehicles and so on. Ruggedised tablets that can be used one-handed, but can also take clip-on physical keyboards, can ensure that these workers get what they need.

3. Social workers

Social workers need devices that can be used for job ticketing, checking the records of the people they are dealing with, and inputting further information while in the client’s house. This could involve a mix of tick-box input via a tethered stylus alongside textual input via a soft or hard keyboard.

Not only should the device be built to cope with being out on the streets, it also needs to be security hardened to ensure that the personally identifiable information (PII) central to a social worker’s job cannot be compromised.

4. First responders

The police, ambulance and fire services are becoming increasingly dependent on accessing information from central points. For example, police need to be able to check car registrations, insurance details and suspects’ details; paramedics need to check medical histories and look up possible treatments for rarer problems; fire service personnel need to check building blueprints and the location of utility connections. Each may also need to record data via voice, image or video using the device.

Such access may need to be carried out in tough situations: shock-proof casings and hardened touch-enabled screens alongside easy touch or voice operation are almost necessities.

5. Loss adjusters

Insurance companies that respond faster to insured events will be winners in the market. The need to collect information rapidly at the point of incident is driving an increasing move to loss adjusters using devices in the field, leading to faster decisions on whether to pay out or not.

This needs devices that can cope with being in all sorts of environments, from minor household or road incidents to burnt-out or flooded buildings, environments with hazardous chemical spills and other major incidents.

The chosen device must be able to cope with contact with all the surroundings, and must also allow for the easy input of data – including image and video via the device’s camera.

 

 

30 Years of Workforce Transformation

We’ve come a long way, but the future looks even brighter. At Dell EMC World 2017, Chief Marketing Officer Jeremy Burton took the audience through 30 years of tech transformation and shared Dell’s exciting innovations for the workforce of tomorrow.

Check the infographic on workforce transformation over the past 30 years.