Meta’s New Open Source AI Model – Now Free

New Open-Source AI Model From Meta and Microsoft

Meta, the parent company of Facebook and Instagram, together with Microsoft has released a new artificial intelligence language model, Llama 2, which is open source and publicly available for both research and commercial use. Meta’s previous version of the model was available only to approved organizations, but the data was leaked and Llama appeared on the network in publicly accessible form. Meta tried to fight the situation and have the model removed from the Internet, for example from GitHub, but Llama had already spread widely and the attempt was unsuccessful. After that, Meta decided to make this AI open.

Microsoft will make Llama 2 available through the Azure AI catalog so that it can be used in the cloud. It will also be possible to work with the model on Windows and through external providers such as AWS and Hugging Face. In effect, it is now one of the first major open-source LLMs and a competitive alternative to the expensive models from OpenAI and Google. Mark Zuckerberg has said that he sees open-source technology as key to how technology will develop in the future.
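
The article does not prescribe a particular workflow, but as a rough illustration of what working with Llama 2 through Hugging Face might look like, here is a minimal sketch. The model ID meta-llama/Llama-2-7b-chat-hf, the use of the transformers library, and the sample prompt are assumptions made for this example, and access to the official checkpoints is gated behind acceptance of Meta’s license.

# Minimal sketch (assumptions: access to the gated meta-llama/Llama-2-7b-chat-hf
# checkpoint has been granted, and the transformers and accelerate packages are installed).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed model ID for this illustration

# Download the tokenizer and model weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Generate a short completion for a sample prompt.
prompt = "Explain in one sentence what an open-source language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))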

Beyond opening the model’s source, Meta has worked to improve its safety and reliability. This was done in part through red-teaming exercises that probe for security gaps. In addition, the pretrained models in this release were trained on roughly two trillion tokens, while the fine-tuned models were further trained on more than a million human annotations.

It can be argued that there are now two major trends in IT: AI and open-source products. Each has already captured the minds and attention of developers and companies around the world. An attempt to combine these two trends is an important step and a likely impetus for the next round of technological development.

Apple No Longer Allows Usage of App APIs for Free

Apple Won’t Let Apps Into the App Store Without an API Explanation

Apple has announced that it will review applications even more thoroughly before adding them to the App Store. This time the restrictions concern APIs: developers will now have to explain in detail why they want to use certain ones. The changes will take effect in the spring of 2024 and will cover about 30 different APIs.

The changes will apply not only to new applications but also to existing ones. Developers of existing applications will have to provide detailed explanations, and if Apple is not satisfied with them, the applications will be disabled. The move has already caused concern among developers and companies, but Apple justifies it by the need to improve user security.

Some APIs are now designated “Required Reason APIs,” and if they are used in an application, the developer will receive a notification from Apple asking them to explain why. The first notifications will start arriving in the fall, after the release of iOS 17, tvOS 17, watchOS 10, and macOS Sonoma.

Some of these APIs can be used for fingerprinting, collecting data such as the IP address, browser, screen resolution, and more. This is what Apple considers a vulnerability, and it is trying to prevent user data from being leaked. However, there are fears that developers will simply stop publishing their applications, for example because the restrictions apply to the popular UserDefaults API, which is used in a huge number of apps. Apple says it will provide a way to appeal decisions on rejected apps, but the already difficult process of publishing to the App Store will become even harder.

Business Meeting Between Kodershop and HISA in Toronto

A few weeks ago, Kodershop held an in-person meeting with HISA employees who are currently working on a software development project. The meeting took place in Toronto and brought together employees from Canada, Europe, and the US. Over five days, participants actively interacted with each other, discussed plans, exchanged ideas, and gave presentations. The development teams generated many new ideas on software architecture, business processes, and product improvements.

From our experience, we know that in-person meetings always improve interaction between teams, strengthen communication, and benefit the product being created. We brought together developers and architects, team leads and marketers, business-process participants from HISA, and experienced experts in the horse-racing industry who spoke about potential problems that may not be obvious.

To better understand how horse racing works, participants from all the teams spent a day together at Woodbine Racetrack. Starting early in the morning, the track staff gave tours of the stables, jockey rooms, and training areas and explained all the nuances of how racing works from the inside. This not only helped everyone better understand the product we are working on but also gave inspiration to all participants.

In addition to the business meetings, employees enjoyed great evenings of informal communication, walked around Toronto, and visited Niagara Falls. It was great team building that allowed all team members to step out of everyday virtual work.

At the end of the working week, Steve Keech took several kilos of presentation materials and sticky notes from the conference room, all drawn up during the business meetings. As work on the project continues, we will implement these ideas in the software’s functionality and make it even more efficient and useful.