In 2016, online education continued to grow in popularity, and more students are now enrolled in online courses than ever before. World-class universities have even begun trialling virtual and augmented reality as learning tools, digital companies have released nano-degrees, and employers including Google and Goldman Sachs have started exploring and adopting online digital credentials. On-campus enrolment, meanwhile, fell at the majority of United States business schools.
As the calendar rolls over to 2017, here’s our list of the top 3 educational technology innovations to keep an eye out for this coming year:
Artificial Intelligence (AI) is becoming increasingly popular as a tool for online education, despite being one of the most hyped innovations of 2016. By combining raw computing power with capabilities modelled on the human brain, AI is already used heavily in consumer technology, powering applications such as Siri on Apple devices and Alexa on Amazon devices. Jozef Misik, managing director of Knowble, a language-tech start-up whose products are built on AI, predicts the technology will soon extend beyond personal assistants into the broader education industry. Improvements in the usability of AI programs, combined with the falling cost of the technology, will lead more educators to integrate these applications into their courses.
For instructors, the benefit will be tools capable of carrying out routine teaching tasks. Algorithms can help instructors assess learning effectiveness as well as support the delivery of content, and deep-learning systems can read, write and mimic human behaviour. Current examples range from the “intelligent tutoring” system pioneered by Colorado State University to improve the reliability of marking, to ‘Jill Watson’, the virtual teaching assistant used by Georgia Tech in 2015.
There will be opposition to AI, just as there was when Moocs were introduced. Some professors will undoubtedly resist learning to integrate these applications and to design the algorithms needed to manage the new platforms. Concerns that staff will be displaced have already been raised. And, as other eagerly adopted innovations have shown, students will not take up new technology unless there is a real benefit.
“AI in education is not inevitable — but it’s necessary,” says Satya Nitta, director of education and cognitive sciences at IBM.
Demand for the conventional university degree has declined, and there is now a real focus on alternative “micro” credentials. These courses sit outside degree programmes and instead deliver specific knowledge and skills in fields such as information science, where skills are scarce. The cost is significantly lower than that of an MBA, and by taking multiple courses a learner can build a personalised academic experience.
With more companies recognising online education credentials, professionals switching employers and careers more regularly, and start-ups proliferating, expect the trend to grow rapidly. People will have to re-skill continuously, says Anant Agarwal, CEO of edX, an online education platform. “This is the age of continuous learning,” he says. “Even if you have a master’s degree, education does not stop there.”
In 2012, when Moocs first took off, they were considered a threat to on-campus learning. In recent years, however, universities have started mixing ‘bricks and clicks’. Many prestigious business schools now run well-regarded “blended” versions of their MBA degrees that combine remote online lessons with in-person assessments. This approach has allowed institutions to preserve the intimate environments in which students can network face to face, while still offering flexibility.
In fact, the majority of universities are integrating technology into physical classrooms to improve educational outcomes. Technology will continue to change how educators communicate with students and deliver knowledge. Course instructors increasingly expect students to be familiar with content before lessons, so that class time can instead be devoted to debating key ideas and competing theories.
Using video cameras, GPS and microchips, the NTU system was tested successfully at a Yishun site.
With the aid of microchips, cameras and satellite tracking, enormous tower cranes may someday be operated remotely. Using digital software that works out the best lifting path, together with modern cameras, a crane operator can now complete remotely any task that would normally be done from the crane cabin. With Global Positioning System (GPS) tracking, operators can also evaluate how closely they have followed the path calculated by the software.
Researchers at Nanyang Technological University (NTU) created the “smart crane system”, which was trialled recently at a Yishun executive condominium site by Kimly Construction. One of the core features demonstrated during the trial was microchip-enabled tracking of precast parts.
Precast construction is a method used to boost productivity, but it is often still labour-intensive because workers must enter every concrete part that arrives on site into a database. NTU’s system instead uses Radio Frequency Identification (RFID) tags: each slab is scanned and its details are entered into the database automatically. The result is a real-time project model, with the Building Information Modelling (BIM) system updated automatically as individual slabs are moved into place.
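The scan-to-model flow described above can be sketched in a few lines. This is a hypothetical illustration only: the tag IDs, slab records and `BimModel` class are invented for the example, and the actual data model of NTU's system is not public.

```python
# Hypothetical sketch of RFID-driven BIM updates: each tag scan updates
# the model automatically, replacing manual data entry. All names and
# values here are invented for illustration.

class BimModel:
    """Minimal stand-in for a Building Information Modelling database."""
    def __init__(self):
        self.slabs = {}  # tag_id -> status record

    def register(self, tag_id, slab_type):
        # A slab arriving on site is recorded without manual typing.
        self.slabs[tag_id] = {"type": slab_type, "status": "on_site"}

    def mark_installed(self, tag_id):
        # The model updates in real time as slabs are lifted into place.
        self.slabs[tag_id]["status"] = "installed"

def on_rfid_scan(model, tag_id, event):
    """Dispatch a scan event to the appropriate model update."""
    if event == "arrival":
        model.register(tag_id, slab_type="precast_wall")
    elif event == "lifted_into_place":
        model.mark_installed(tag_id)

model = BimModel()
on_rfid_scan(model, "TAG-0001", "arrival")
on_rfid_scan(model, "TAG-0001", "lifted_into_place")
print(model.slabs["TAG-0001"]["status"])  # installed
```

The point of the design is that the electronic record exists from the moment a part arrives, so it can be passed from one process to the next, as the researchers describe below.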
“From the beginning, it’s electronic information that can be passed from one process to another,” explained NTU research fellow Meghdad Attarzadeh. Performance gains have already been seen, with site logistics 10%-20% more efficient and a 30% reduction in the time spent checking inventory.
The BIM system is also used to compute the safest and most efficient lifting path for each item. A GPS sensor attached to the crane’s hook monitors how closely the operator is following the planned path.
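How closely an operator follows a planned path can be scored in a simple way, for example by measuring each recorded hook position's distance to the nearest planned point. This is an illustrative sketch, not NTU's actual algorithm; the coordinates are invented 2D positions in metres.

```python
# Illustrative path-deviation score (not NTU's actual method): mean
# distance from each recorded GPS point to the nearest planned point.
import math

def deviation(planned, recorded):
    """Mean distance (metres) from each recorded point to the nearest
    point on the planned path."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    total = sum(min(dist(r, p) for p in planned) for r in recorded)
    return total / len(recorded)

planned_path = [(0, 0), (5, 0), (10, 0)]       # planned lifting course
good_run = [(0, 0.5), (5, 0.2), (10, 0.1)]     # recorded hook positions
print(round(deviation(planned_path, good_run), 2))  # 0.27
```

A real system would use 3D positions and compare against path segments rather than waypoints, but the idea of an objective "how well did the operator follow the computed course" score is the same.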
A tower crane operator’s view is often obstructed, so operators typically depend on signalmen to guide them from below. The new hook-mounted cameras let the operator see exactly where the hook is positioned relative to the crane’s surroundings.
“For the very first few elements, it was a challenge (for the crane operators),” said Mr Choo, director of Kimly Construction. “But by the last few, when they had built up the confidence, the speed really picked up.”
NTU Associate Professor Robert Tiong, leader of the research group, said: “The dream is for a game-changing ‘smart crane’ system.” He also noted that the system might one day allow cranes and low loaders to be operated remotely.
The partnership began in December 2015, with the NTU team spending months developing and testing the system. For more than six weeks across October and November, the new system was used to build the 10th and 11th storeys of two blocks of the project. The trial was solely funded by the Building and Construction Authority’s Productivity Innovation Project Scheme.
NTU researchers are looking to develop the system further by partnering with organisations such as JTC Corporation, a national industrial developer. Kimly Construction also hopes to use the system on suitable future projects. As Mr Choo says: “It helps the (precast) process become a more systematic method of construction.”
Is mobile application testing similar to standard software testing?
If the answer were ‘yes’, the lives of software testers would be easier, but unfortunately it’s a resounding no. Mobile application testing differs in several ways: the range of devices is far greater, so you are testing for many more variables than in standard software testing.
One of the key distinctions is the way Appium, the most popular testing tool for mobile apps, differs from Selenium, one of the most common tools for testing non-mobile applications. Appium and Selenium operate differently, which significantly affects the degree of automation possible in tests.
By comparing Selenium and Appium against one another, I’ve been able to gain some insights about how to maximise the use of each tool. Below is what I found out.
Selenium, the popular browser-based suite of automation tools, is a vital testing application because it automates web browsers. Automating tests for web applications is not its only use: it can also automate web-based administrative tasks. Because of this, many DevOps engineers feel you essentially get double the value from Selenium.
It’s important to remember that Selenium isn’t a single tool but several smaller tools which, combined, make up two central functional parts.
The biggest issue with Selenium, which has become obvious with the rise of smartphones, is that it was not designed for mobile app automation testing. Fortunately, Appium addresses this core issue. Built on Selenium (like Selendroid), Appium is a tool developed specifically for automated testing of mobile apps.
Appium Architecture: Written in Node.js and exposing a REST API, Appium is an HTTP web server. Apps do not need to be recompiled, because Appium drives vendor-provided automation frameworks such as Google’s UiAutomator and Apple’s UIAutomation. The app you test is therefore the app you deliver. Language-locking (also known as framework-locking) is avoided because the vendor frameworks are wrapped inside the WebDriver API, which specifies a client-server protocol: the client can write the appropriate HTTP requests to the server in any language.
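The "vendor frameworks behind one API" idea can be sketched as a simple dispatch table. This is a toy illustration only: the handler functions below are invented stand-ins, and real Appium drivers are far richer than a single click command.

```python
# Toy illustration of Appium's architecture: one WebDriver-style command
# surface, with a platform-specific vendor framework behind it. Handler
# names are invented for this sketch.

def uiautomator_click(element_id):
    # Stand-in for a call into Google's UiAutomator on Android.
    return {"platform": "android", "clicked": element_id}

def uiautomation_click(element_id):
    # Stand-in for a call into Apple's UIAutomation on iOS.
    return {"platform": "ios", "clicked": element_id}

# The server picks a backend based on the session's platform, so clients
# speak one protocol regardless of which vendor framework does the work.
BACKENDS = {"Android": uiautomator_click, "iOS": uiautomation_click}

def handle_click(platform_name, element_id):
    """Route a WebDriver-style 'click' command to the right backend."""
    return BACKENDS[platform_name](element_id)

print(handle_click("Android", "login_button")["platform"])  # android
```

Because the client only ever sees the common command surface, the same test script can target either platform; this is why no recompilation of the app is needed.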
Only a couple of alterations are needed for the WebDriver protocol to handle mobile, since WebDriver is already the standard for web browser automation. Another benefit of Appium is that it’s open source: the Appium architecture is freely documented on GitHub if you want to know more.
Appium Sessions: In Appium, a session is simply the automation of a single test. Every client sends an HTTP POST request to the server containing a “desired capabilities” JSON object. The server then launches the session and responds with a session ID, which is used in later requests. Desired capabilities are a set of keys and values that tell the Appium server what kind of automation session the client wants; other capabilities can be used to modify server behaviour during automation (a complete list is in the Appium documentation). For these reasons, Appium is viewed as one of the best options for automating mobile web app testing.
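A desired-capabilities object is just JSON, so the session request a client POSTs can be shown directly. The capability values below are illustrative rather than a working device configuration, and the simulated response stands in for what a real Appium server would return.

```python
# Sketch of the "desired capabilities" JSON a client POSTs to Appium to
# start a session (the session endpoint in older Appium versions is
# /wd/hub/session). Values are illustrative, not a real configuration.
import json

desired_caps = {
    "platformName": "Android",
    "platformVersion": "7.0",
    "deviceName": "Android Emulator",
    "app": "/path/to/app.apk",  # illustrative path to the app under test
}

# The POST body wraps the capabilities object:
body = json.dumps({"desiredCapabilities": desired_caps})

# The server launches the session and replies with a session ID that the
# client attaches to later requests -- simulated here:
response = {"sessionId": "abc123", "status": 0}
print(response["sessionId"])
```

Every subsequent command in the session is another HTTP request carrying that session ID, which is what makes the protocol language-agnostic: any language that can speak HTTP and JSON can drive Appium.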
Mobile application testing is unique and differs greatly from standard web application testing, because testers must cover many devices and operating-system versions. However, browser automation tools like Selenium can have their frameworks modified to make them compatible with mobile test automation, so Selenium can be used much like Appium. Whichever you choose, both tools can be used through Sauce Labs’ cloud-based testing platform.
The DEF CON 24 event in 2016 hosted the DARPA Cyber Grand Challenge, in which seven teams’ automated security systems faced off in a capture-the-flag-style competition for a grand prize of two million dollars.
The objective was to design programs that autonomously detect vulnerabilities and self-patch to fend off system intrusions.
This technology is interesting because it exploits the intricacies of today’s network environments to improve service security in several different ways. A key question, however, is whether it is worthwhile, as autonomous patching carries various unintended negative impacts and dangers. And, while still a niche area, industry leaders should start considering how advances in information-security programs and connected systems could go awry in the future.
It is true that many businesses could use some form of automated patching. It would free up resources: software flaws would be identified and patches released automatically, rendering future attacks on the system ineffective. The idea is very appealing to auditors and executives alike. In reality, however, patching is a complicated process spanning several systems, people and processes that operate in tandem to ensure updates are released effectively, improving security while keeping the network environment stable.
Stability in the environment is the top priority of many security professionals; many have had first-hand experience of one bad patch taking down an otherwise completely stable system. For obvious reasons, a bad patch is typically seen as a worse outcome for the business than the security weakness it was meant to fix. The software quality of contemporary operating systems and applications is better than ever before, and it is entirely possible to release patches in a semi-automated style. As a bare minimum, semi-automation should occur at the workstation level for third-party software from vendors such as Oracle (Java) and Adobe.
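What "semi-automated" means in practice can be sketched simply: an automated inventory scan flags out-of-date software, but a human approves before anything is deployed. The package names and version strings below are invented for illustration; a real tool would query actual installed software and vendor feeds.

```python
# Hedged sketch of semi-automated patching: detection is automatic,
# deployment waits for human approval. All version data is invented.

INSTALLED = {"java": "8u101", "adobe_reader": "15.006"}  # what's on the fleet
LATEST    = {"java": "8u111", "adobe_reader": "15.020"}  # vendor releases

def find_outdated(installed, latest):
    """Return packages whose installed version differs from the latest
    known release (a real tool would compare versions properly)."""
    return [name for name, ver in installed.items()
            if latest.get(name, ver) != ver]

pending = find_outdated(INSTALLED, LATEST)
print(sorted(pending))  # ['adobe_reader', 'java']

# A human or change-control process reviews `pending` before any patch
# is pushed -- that review step is the "semi" in semi-automated, and it
# is what guards the stability that security teams prize.
```

Fully autonomous patching would remove that review step, which is exactly the trade-off the rest of this piece weighs.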
However, a key question is how autonomous software and firmware updates can be applied to servers, applications, databases and network infrastructure systems. Moreover, the cost of automated patching relative to its benefits has many experts questioning whether it is worthwhile. The advantages may well surpass the drawbacks, but it is worth noting that the DARPA Cyber Grand Challenge used a specific testbed of computer systems running customised software that had not previously been evaluated.
Given that the Cyber Grand Challenge was the first automated patching competition in information security, it is impressive that the competing teams’ systems took only 10 hours to find and patch vulnerabilities that would normally take weeks or months to resolve. With complex legacy applications and enterprise software products, however, it is unclear how effective autonomous patching systems will be. Each circumstance is unique, but the reality is that, given the limited resources of business IT security teams and the human tendency to make mistakes, security should be automated wherever possible.
We should start thinking now about which parts of the patching process could or should be automated in the future. By attempting to automate certain security functions today, we are likely to carry many of those systems and lessons into other areas, greatly aiding future security efforts.
When asked which laptop will meet someone’s needs, the first thing I say is that there is no decisive winner. There are so many device types and price ranges to consider that it really does depend on who’s asking. What I can provide is a guide to the aspects you should evaluate before spending lavishly on a brand-new computer with all the extras.
Intel’s Core-based CPUs are probably the best option when buying a laptop. For multitasking and multimedia work, the Core i3, i5 and i7 all perform better than their competitors.
Laptops with Core i3 processors are normally simpler systems aimed at entry-level users who need only the basics, while the large majority of laptops and computers sold through retail run a Core i5.
If you’re looking to push your laptop to maximum performance, you can’t go past the Core i7. While it is the fastest CPU in the range, however, heat radiating from the bottom of the machine can make it uncomfortable to use on your lap for long periods.
The minimum amount of RAM required for your system to perform at its best is 4GB. Essentially, the more RAM you have, the more applications you can run simultaneously. More RAM also means the system can access more data at any one time, which is highly useful if you plan to do a lot of image editing.
Bluetooth and Wireless Networking:
People commonly forget to check this when buying a new laptop, which is unfortunate, as most people depend on a reliable internet connection to use their laptop productively. As a bare minimum, look for laptops with dual-band Wi-Fi adapters. Most routers offer both 2.4GHz and 5GHz networks; with a dual-band adapter you can work faster on the better 5GHz connection, and keep your laptop separate from the other devices crowding the 2.4GHz network. Good wireless connectivity is also essential for work purposes such as cloud backup services and many CMS and CRM systems.
Bluetooth is often underappreciated, as many buyers only use it to connect wireless keyboards and mice. Look for a laptop with Bluetooth 4.0, which will also let you set up things like a Hi-Fi system without cords, so you can play music from your laptop, from Spotify or online radio, through the stereo system.