What is Fastify?
In the rapidly evolving world of web development, developers are constantly on the lookout for frameworks that can provide both speed and efficiency. Enter Fastify, a lightweight and lightning-fast web framework for Node.js that has taken the development community by storm. If you’re a developer looking to create high-performance, scalable, and secure web applications, Fastify may be the game-changer you’ve been waiting for.
What is Fastify?
Fastify, developed by Matteo Collina and Tomas Della Vedova, is an open-source web framework for Node.js designed with a primary focus on speed and low overhead. Launched in 2016, Fastify has quickly gained popularity in the Node.js ecosystem due to its impressive performance, simplicity, and extensibility. It is built on top of Node.js’s HTTP module and takes full advantage of the latest JavaScript features to maximize its speed and efficiency.
Getting started with Fastify:
- npm init
- npm i fastify
- Fastify requires Node.js version 14 or later.
- In Express you return JSON data with res.json({ hello: "world" }); in Fastify you can simply return the object and it is serialized to JSON automatically.
```js
// Require the framework and instantiate it
const fastify = require("fastify")({ logger: true });

// Declare a route
fastify.get("/", async (request, reply) => {
  return { hello: "world" };
});

// Start the server
fastify.listen(3000);
```
Fastify comes with an amazing set of features that will give your project a boost:
The Need for Speed
One of the primary reasons developers are flocking to Fastify is its exceptional performance. Thanks to its powerful and highly optimized core, Fastify boasts some of the fastest request/response times among Node.js frameworks. It leverages features like request validation, which is automatically generated from JSON schemas, to ensure that data is processed swiftly and accurately. Additionally, Fastify supports asynchronous programming and handles requests concurrently, making it ideal for handling heavy workloads and high traffic.
Minimalism and Extensibility
Fastify follows a minimalist approach, focusing on providing only the essential components needed to build web applications efficiently. Developers can opt-in to use various plugins to extend Fastify’s functionality as per their requirements. This approach not only keeps the core lightweight but also gives developers the flexibility to customize their stack with the specific tools they need. Furthermore, the ecosystem around Fastify is growing rapidly, with a wide array of plugins and middleware available, making it easy to integrate third-party tools seamlessly.
Developer-Friendly API
Fastify’s API is designed to be intuitive and easy to use, reducing the learning curve for developers. Its well-documented and expressive API allows developers to write clean, maintainable, and organized code. The framework’s emphasis on proper error handling and logging also contributes to its ease of use, helping developers quickly identify and rectify issues during development and production.
JSON Schema-Based Validation
Data validation is a crucial aspect of web application development to ensure data integrity and security. Fastify utilizes JSON Schema for data validation, enabling developers to define the expected shape of incoming requests and responses. This not only simplifies the validation process but also automatically generates detailed and helpful error messages, making debugging a breeze.
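To make this concrete, here is a minimal sketch of attaching a JSON schema to a route; the route path and schema fields are illustrative, not taken from the Fastify docs:

```js
const fastify = require("fastify")({ logger: true });

// Illustrative route: the body must match the schema before the handler runs
fastify.post("/users", {
  schema: {
    body: {
      type: "object",
      required: ["name", "email"],
      properties: {
        name: { type: "string" },
        email: { type: "string" }
      }
    }
  }
}, async (request, reply) => {
  // request.body has already been validated against the schema here
  return { created: request.body.name };
});

fastify.listen(3000);
```

If a request body does not match the schema, Fastify rejects it with a 400 response and a generated error message before your handler ever runs.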
Enhanced Security
Fastify is designed with security in mind. It encourages best practices such as using the latest cryptographic libraries and secure authentication mechanisms. Additionally, Fastify has a built-in protection mechanism against common web application attacks like Cross-Site Scripting (XSS) and Cross-Site Request Forgery (CSRF). With Fastify, developers can rest assured that their applications are less prone to security vulnerabilities.
Conclusion
Fastify’s emergence as a top-tier web framework for Node.js is no coincidence. Its commitment to speed, minimalism, and extensibility sets it apart from the competition. Whether you’re building a small-scale API or a large-scale application, Fastify’s performance, easy-to-use API, and emphasis on security make it an excellent choice.
In the fast-paced world of web development, having a framework that can boost productivity and deliver top-notch performance is essential. Fastify has proven itself as a reliable and efficient framework, providing developers with the tools they need to create high-performance applications without compromising on code quality and security.
So, if you’re ready to take your Node.js projects to the next level, give Fastify a try, and experience the speed and power it brings to your development workflow.
Content Source:
- fastify.dev
What is new in Laravel 10?
With Laravel 9, the framework moved to an annual release schedule. This transition is intended to ease the maintenance burden on the community and challenge the development team to ship amazing, powerful new features without introducing breaking changes. Consequently, a variety of robust features were shipped to Laravel 9 without breaking backwards compatibility.
This commitment to ship great new features during the current release will likely lead to future "major" releases being primarily used for "maintenance" tasks such as upgrading upstream dependencies, which can be seen in these release notes.
Laravel 10 continues the improvements made in Laravel 9.x by introducing argument and return types to all application skeleton methods, as well as all stub files used to generate classes throughout the framework. In addition, a new, developer-friendly abstraction layer has been introduced for starting and interacting with external processes.
PHP 8.1:
PHP 8.1 is the minimum-required PHP version in Laravel 10. Some PHP 8.1 features, such as readonly properties and array_is_list, are used in Laravel 10.
Laravel Official Packages Upgrade
Not only is the framework professionally maintained and updated on a regular basis, but so are all of the official packages and the ecosystem.
The following is a list of the most recent official Laravel packages that have been updated to support Laravel 10:
- Breeze
- Cashier Stripe
- Dusk
- Horizon
- Installer
- Jetstream
- Passport
- Pint
- Sail
- Scout
- Valet
Predis Version Upgrade
Predis is a robust Redis client for PHP that may help you get the most out of caching to provide a fantastic user experience. Laravel formerly supported both versions 1 and 2, but as of Laravel 10, the framework no longer supports Predis 1.
Although Laravel documentation mentions Predis as the package for interacting with Redis, you may also use the official PHP extension. This extension provides an API for communicating with Redis servers.
All Validation Rules Invokable by Default
If you were to make an invokable validation rule in Laravel 9, you would need to add an --invokable flag after the Artisan command. This is no longer necessary because all Laravel 10 rules are invokable by default. So, you may run the following command to create a new invokable rule in Laravel 10:
php artisan make:rule CustomRule
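The generated class is invokable out of the box. Roughly speaking, it looks like this (a sketch based on the standard Laravel 10 rule stub, with an illustrative check added):

```php
<?php

namespace App\Rules;

use Closure;
use Illuminate\Contracts\Validation\ValidationRule;

class CustomRule implements ValidationRule
{
    /**
     * Run the validation rule.
     */
    public function validate(string $attribute, mixed $value, Closure $fail): void
    {
        // Illustrative check; replace with your own logic
        if ($value === 'forbidden') {
            $fail('The :attribute contains a forbidden value.');
        }
    }
}
```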
Types
On its initial release, Laravel utilized all of the type-hinting features available in PHP at the time. However, many new features have been added to PHP in the subsequent years, including additional primitive type-hints, return types, and union types.
Laravel 10.x thoroughly updates the application skeleton and all stubs utilized by the framework to introduce argument and return types to all method signatures. In addition, extraneous “doc block” type-hint information has been deleted.
This change is entirely backwards compatible with existing applications. Therefore, existing applications that do not have these type-hints will continue to function normally.
Test Profiling
The Artisan test command has received a new --profile option that allows you to easily identify the slowest tests in your application:
php artisan test --profile
Generator CLI Prompts
To improve the framework’s developer experience, all of Laravel’s built-in make commands no longer require any input. If the commands are invoked without input, you will be prompted for the required arguments:
php artisan make:controller
String Password Helper Function
Laravel 10 can create a random and secure password with a given length:
$password = Str::password(12);
Laravel Pennant
A new first-party package, Laravel Pennant, has been released. Laravel Pennant offers a light-weight, streamlined approach to managing your application’s feature flags. Out of the box, Pennant includes an in-memory array driver and a database driver for persistent feature storage.
Features can be easily defined via the Feature::define method:
```php
use Laravel\Pennant\Feature;
use Illuminate\Support\Lottery;

Feature::define('new-onboarding-flow', function () {
    return Lottery::odds(1, 10);
});
```
Once a feature has been defined, you may easily determine if the current user has access to the given feature:
```php
if (Feature::active('new-onboarding-flow')) {
    // ...
}
```
Of course, for convenience, Blade directives are also available:
```blade
@feature('new-onboarding-flow')
    <div>
        <!-- ... -->
    </div>
@endfeature
```
Process Interaction
Laravel 10.x introduces a beautiful abstraction layer for starting and interacting with external processes via a new Process facade:
```php
use Illuminate\Support\Facades\Process;

$result = Process::run('ls -la');

return $result->output();
```
Processes may even be started in pools, allowing for the convenient execution and management of concurrent processes:
```php
use Illuminate\Process\Pool;
use Illuminate\Support\Facades\Process;

[$first, $second, $third] = Process::concurrently(function (Pool $pool) {
    $pool->command('cat first.txt');
    $pool->command('cat second.txt');
    $pool->command('cat third.txt');
});

return $first->output();
```
Horizon / Telescope Facelift
Horizon and Telescope have been updated with a fresh, modern look, including improved typography, spacing, and design.
Pest Scaffolding
New Laravel projects can now be created with Pest test scaffolding. To opt in, use the --pest flag when building a new app with the Laravel installer:
laravel new example-application --pest
Conclusion
Laravel 10 is a significant release for the Laravel framework, and it comes with several new features and improvements that will help developers create more robust and efficient web applications. Native type declarations, Laravel Pennant, the new Process facade, and invokable validation rules by default make it easier for developers to build modular and scalable applications. The dropping of PHP 8.0 support is also a significant decision that ensures developers are using a newer version of PHP, which is more secure and efficient. As always, Laravel continues to evolve and innovate, making it an excellent choice for web development projects.
Content Source:
- laravel.com
What is new in Node.js 20?
Unleashing the Power of Node 20: Embracing the Future of JavaScript Development
In the realm of server-side JavaScript, Node.js has become a dominant force, revolutionizing the way we build web applications. With each new version, Node.js brings forth exciting enhancements, improved performance, and expanded capabilities. In this blog, we’ll embark on a journey through the evolution of Node.js, exploring the advancements that have led to the highly anticipated Node 20. We’ll delve into the key features of Node 20 and showcase an example that demonstrates its potential.
From Past to Present: The Evolution of Node.js
Since its initial release in 2009, Node.js has evolved significantly, shaping the landscape of JavaScript development. The first versions of Node.js introduced a non-blocking, event-driven architecture, enabling developers to build highly scalable and efficient applications. With its growing popularity, Node.js gained a vibrant ecosystem of modules and libraries, making it a versatile platform for both back-end and full-stack development.
As Node.js progressed, new features were introduced to enhance performance, security, and developer productivity. For instance, Node.js 8 shipped as a Long-Term Support (LTS) release line, which provided stability and backward compatibility. Node.js 10 brought improvements in error handling and diagnostic reports, making it easier to identify and resolve issues. Node.js 12 introduced enhanced default heap limits and improved performance metrics.
Introducing Node 20: A Leap Forward in JavaScript Development
Now, let’s turn our attention to Node 20, the latest iteration of Node.js, and explore its groundbreaking features that are set to shape the future of JavaScript development.
- Improved Performance and Speed
- Enhanced Security
- Improved Debugging Capabilities
- ECMAScript Modules (ESM) Support
- Enhanced Worker Threads
- Stable Test Runner
- url.parse() Warns on URLs With Ports That Are Not Numbers
Let’s explore what they are and how to use them.
1. Improved Performance and Speed
Node.js 20 incorporates the latest advancements in the V8 JavaScript engine, resulting in significant performance improvements. Let's take a look at an example:
```js
// File: server.js
const http = require('http');

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello, world!');
});

server.listen(3000, () => {
  console.log('Server running on port 3000');
});
```
By leveraging the performance optimizations in Node.js 20, applications like the one above experience reduced response times and enhanced scalability, resulting in an improved user experience.
2. Enhanced Security
Security is a top priority for any application, and Node.js 20 introduces several features to bolster its security. One noteworthy enhancement is the upgraded TLS implementation, ensuring secure communication between servers and clients. Here’s an example of using TLS in Node.js 20:
```js
// File: server.js
const https = require('https');
const fs = require('fs');

const options = {
  key: fs.readFileSync('private.key'),
  cert: fs.readFileSync('certificate.crt')
};

const server = https.createServer(options, (req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Secure Hello, world!');
});

server.listen(3000, () => {
  console.log('Server running on port 3000');
});
```
With the upgraded TLS implementation, Node.js 20 ensures secure data transmission, safeguarding sensitive information.
3. Improved Debugging Capabilities
Node.js 20 introduces enhanced diagnostic and debugging capabilities, empowering developers to pinpoint and resolve issues more effectively. Consider the following example:
```js
// File: server.js
const { performance, PerformanceObserver } = require('perf_hooks');

const obs = new PerformanceObserver((items) => {
  console.log(items.getEntries()[0].duration);
  performance.clearMarks();
});
obs.observe({ entryTypes: ['measure'] });

performance.mark('start');
// ... Code to be measured ...
performance.mark('end');

performance.measure('Duration', 'start', 'end');
```
In this example, the Performance API allows developers to measure the execution time of specific code sections, enabling efficient optimization and debugging.
4. ECMAScript Modules (ESM) Support
Node.js 20 embraces ECMAScript Modules (ESM), providing a standardized approach to organize and reuse JavaScript code. Let’s take a look at an example:
```js
// File: module.js
export function greet(name) {
  return `Hello, ${name}!`;
}

// File: app.js
import { greet } from './module.js';

console.log(greet('John'));
```
With ESM support, developers can now leverage the benefits of code encapsulation and organization in Node.js, facilitating better code reuse and maintenance.
5. Enhanced Worker Threads
Node.js 20 introduces improved worker threads, enabling true multi-threading capabilities within a Node.js application. Consider the following example:
```js
// File: worker.js
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  const worker = new Worker(__filename, { workerData: 'Hello, worker!' });
  worker.on('message', message => console.log(message));
} else {
  parentPort.postMessage(workerData);
}
```
In this example, the main thread creates a worker thread that receives data and sends a message back. With enhanced worker threads, Node.js 20 empowers developers to harness the full potential of multi-core processors, improving application performance.
6. Stable Test Runner
Node.js 20 includes an important change to the test_runner module. The module has been marked as stable after a recent update. Previously, the test_runner module was experimental, but this change marks it as a stable module ready for production use.
7. url.parse() Warns on URLs With Ports That Are Not Numbers
url.parse() accepts URLs with ports that are not numbers. This behavior might result in hostname spoofing with unexpected input. Such URLs will throw an error in future versions of Node.js, as the WHATWG URL API already does. Starting with Node.js 20, these URLs cause url.parse() to emit a warning.
Here is urlParse.js:
```js
const url = require('node:url');

url.parse('https://example.com:80/some/path?pageNumber=5');  // no warning
url.parse('https://example.com:abc/some/path?pageNumber=5'); // shows warning
```
Execute node urlParse.js. The URL https://example.com:80/some/path?pageNumber=5 with a numerical port does not show a warning, but https://example.com:abc/some/path?pageNumber=5 with a non-numeric port shows a warning.
```
% node urlParse.js
(node:21534) [DEP0170] DeprecationWarning: The URL https://example.com:abc/some/path?pageNumber=5 is invalid. Future versions of Node.js will throw an error.
(Use `node --trace-deprecation ...` to show where the warning was created)
```
Conclusion
Node.js 20 brings a plethora of innovative features and enhancements that revolutionize the way developers build applications. Improved performance, enhanced security, advanced debugging capabilities, ECMAScript Modules support, and enhanced worker threads open up new possibilities for creating scalable, secure, and high-performing applications. By leveraging these cutting-edge features, developers can stay at the forefront of modern web development and deliver exceptional user experiences. Upgrade to Node.js 20 today and unlock a new era of JavaScript development!
Content Source:
- medium.com
What’s new in Node.js 19
Node.js major release is rolled out every six months. The new release becomes the Current release for six months, which gives library authors time to add support for them.
After six months, odd-numbered releases, such as 19, become unsupported, and even-numbered releases, such as 18, move to the Active LTS (long-term support) status and are ready for general use.
LTS release typically guarantees that critical bugs will be fixed for a total of 30 months. Production applications should only use Active LTS or Maintenance LTS releases.
Node.js 19 was released recently which comes with 6 major features:
- Experimental node watch mode
- HTTP(S)/1.1 KeepAlive by default
- Stable WebCrypto
- Custom ESM resolution adjustments
- Dropped DTrace/SystemTap/ETW support
- V8 JavaScript engine is updated to V8 10.7
Let’s explore what they are and how to use them.
Use NVM to explore node
Run the command to install node 19.0.0:
```
% nvm install 19.0.0
Downloading and installing node v19.0.0...
Downloading https://nodejs.org/dist/v19.0.0/node-v19.0.0-darwin-x64.tar.xz...
Computing checksum with sha256sum
Checksums matched!
Now using node v19.0.0 (npm v8.19.2)
```
On any window, run the command to use node 19:
```
% nvm use 19
Now using node v19.0.0 (npm v8.19.2)
```
Now you’re ready to explore:
```
% node --version
v19.0.0
```
Experimental node watch mode
In A Hands-on Guide for a Server-Side Rendering React 18 App, you have to build a production Create React App by executing npm run build.
You can create server/index.js:
```js
const express = require("express");
const path = require("path");

const app = express();
app.use(express.static(path.join(__dirname, "../build")));
app.listen(8080, () =>
  console.log("Express server is running on localhost:8080")
);
```
The server was running with nodemon, a tool that helps to develop Node.js applications by automatically restarting the application when file changes are detected. The command is nodemon server.
With Node.js 19, you no longer need to install an additional tool. Instead, you can execute node --watch to automatically restart the application when file changes are detected.
```
% node --watch server
(node:67643) ExperimentalWarning: Watch mode is an experimental feature. This feature could change at any time
(Use `node --trace-warnings ...` to show where the warning was created)
Express server is running on localhost:8080
```
HTTP(S)/1.1 KeepAlive by default
Node.js 19 sets keepAlive to true by default. This means that any outgoing HTTP(s) connection will automatically use HTTP 1.1 keepAlive. The default keepAlive duration is 5 seconds.
Change the above server/index.js to:
```js
const http = require('node:http');
console.log(http.globalAgent);

const https = require('node:https');
console.log(https.globalAgent);
```
Execute the server code using node.js 16:
```
% nvm use 16
Now using node v16.0.0 (npm v7.10.0)
% node server
Agent {
  _events: [Object: null prototype] {
    free: [Function (anonymous)],
    newListener: [Function: maybeEnableKeylog]
  },
  _eventsCount: 2,
  _maxListeners: undefined,
  defaultPort: 80,
  protocol: 'http:',
  options: [Object: null prototype] { path: null },
  requests: [Object: null prototype] {},
  sockets: [Object: null prototype] {},
  freeSockets: [Object: null prototype] {},
  keepAliveMsecs: 1000,
  keepAlive: false,
  maxSockets: Infinity,
  maxFreeSockets: 256,
  scheduling: 'lifo',
  maxTotalSockets: Infinity,
  totalSocketCount: 0,
  [Symbol(kCapture)]: false
}
Agent {
  _events: [Object: null prototype] {
    free: [Function (anonymous)],
    newListener: [Function: maybeEnableKeylog]
  },
  _eventsCount: 2,
  _maxListeners: undefined,
  defaultPort: 443,
  protocol: 'https:',
  options: [Object: null prototype] { path: null },
  requests: [Object: null prototype] {},
  sockets: [Object: null prototype] {},
  freeSockets: [Object: null prototype] {},
  keepAliveMsecs: 1000,
  keepAlive: false,
  maxSockets: Infinity,
  maxFreeSockets: 256,
  scheduling: 'lifo',
  maxTotalSockets: Infinity,
  totalSocketCount: 0,
  maxCachedSessions: 100,
  _sessionCache: { map: {}, list: [] },
  [Symbol(kCapture)]: false
}
```
- It shows that http.globalAgent sets keepAlive to false.
- It shows that https.globalAgent sets keepAlive to false.
Execute the server code using node.js 19:
```
% nvm use 19
Now using node v19.0.0 (npm v8.19.2)
% node server
Agent {
  _events: [Object: null prototype] {
    free: [Function (anonymous)],
    newListener: [Function: maybeEnableKeylog]
  },
  _eventsCount: 2,
  _maxListeners: undefined,
  defaultPort: 80,
  protocol: 'http:',
  options: [Object: null prototype] {
    keepAlive: true,
    scheduling: 'lifo',
    timeout: 5000,
    noDelay: true,
    path: null
  },
  requests: [Object: null prototype] {},
  sockets: [Object: null prototype] {},
  freeSockets: [Object: null prototype] {},
  keepAliveMsecs: 1000,
  keepAlive: true,
  maxSockets: Infinity,
  maxFreeSockets: 256,
  scheduling: 'lifo',
  maxTotalSockets: Infinity,
  totalSocketCount: 0,
  [Symbol(kCapture)]: false
}
Agent {
  _events: [Object: null prototype] {
    free: [Function (anonymous)],
    newListener: [Function: maybeEnableKeylog]
  },
  _eventsCount: 2,
  _maxListeners: undefined,
  defaultPort: 443,
  protocol: 'https:',
  options: [Object: null prototype] {
    keepAlive: true,
    scheduling: 'lifo',
    timeout: 5000,
    noDelay: true,
    path: null
  },
  requests: [Object: null prototype] {},
  sockets: [Object: null prototype] {},
  freeSockets: [Object: null prototype] {},
  keepAliveMsecs: 1000,
  keepAlive: true,
  maxSockets: Infinity,
  maxFreeSockets: 256,
  scheduling: 'lifo',
  maxTotalSockets: Infinity,
  totalSocketCount: 0,
  maxCachedSessions: 100,
  _sessionCache: { map: {}, list: [] },
  [Symbol(kCapture)]: false
}
```
- It shows that http.globalAgent sets keepAlive to true.
- It shows that the default keepAlive duration is 5 seconds (5000 ms).
- It shows that https.globalAgent sets keepAlive to true.
- It shows that the default keepAlive duration is 5 seconds (5000 ms).
Enabling keepAlive delivers better throughput, as connections are reused by default.
Additionally, the agent can parse the Keep-Alive response header that servers might send. This header instructs the client on how much longer to stay connected.
On the other side, the HTTP server will automatically disconnect idle clients when close() is invoked. This is accomplished by http(s).Server.close calling closeIdleConnections internally.
With these changes, HTTP(S)/1.1 requests may see better throughput and performance by default.
Stable WebCrypto
The WebCrypto API is an interface for building systems that use cryptography. With Node.js 19, the WebCrypto API is stable (with the exception of these algorithms: Ed25519, Ed448, X25519, and X448).
You can use globalThis.crypto or require('node:crypto').webcrypto to access this module. The following server/index.js uses subtle as an example; the SubtleCrypto interface provides a number of low-level cryptographic functions:
```js
const { subtle } = globalThis.crypto;

(async function () {
  const key = await subtle.generateKey(
    { name: 'HMAC', hash: 'SHA-256', length: 256 },
    true,
    ['sign', 'verify']
  );
  console.log('key =', key);

  const enc = new TextEncoder();
  const message = enc.encode('I love cupcakes');
  console.log('message =', message);

  const digest = await subtle.sign({ name: 'HMAC' }, key, message);
  console.log('digest =', digest);
})();
```
- An HMAC key is generated. HMAC is a specific type of message authentication code (MAC) that involves a cryptographic hash function and a secret cryptographic key. The generated key can be used to simultaneously verify the data integrity and the authenticity of a message.
- A message, "I love cupcakes", is encoded.
- A message digest is created from the key and the message. A message digest is the output of a cryptographic hash function: a string of digits created by a one-way hashing formula.
The following console information shows the values of key, message, and digest:
```
% node server
key = CryptoKey {
  type: 'secret',
  extractable: true,
  algorithm: { name: 'HMAC', length: 256, hash: [Object] },
  usages: [ 'sign', 'verify' ]
}
message = Uint8Array(15) [
  73, 32, 108, 111, 118, 101, 32, 99,
  117, 112, 99, 97, 107, 101, 115
]
digest = ArrayBuffer {
  [Uint8Contents]: <30 01 7a 5c d9 e2 82 55 6b 55 90 4f 1d de 36 d7 89 dd fb fb 1a 9e a0 cc 5d d8 49 13 38 2f d1 bc>,
  byteLength: 32
}
```
Custom ESM resolution adjustments
Node.js has removed the --experimental-specifier-resolution flag, because its functionality can be achieved via custom loaders.
Clone the example repository:
git clone https://github.com/nodejs/loaders-test.git
Go to the example directory:
% cd loaders-test/commonjs-extension-resolution-loader
Install the packages:
% yarn install
Here is loaders-test/commonjs-extension-resolution-loader/test/basic-fixtures/index.js:
```js
import { version } from 'process';
import { valueInFile } from './file';
import { valueInFolderIndex } from './folder';

console.log(valueInFile);
console.log(valueInFolderIndex);
```
- Line 1 is unused.
- valueInFile is imported from './file' without specifying the file extension. Without a custom loader, Node's ESM specifier resolution does not automatically resolve file extensions, such as ./file.js or ./file.mjs.
Here is loaders-test/commonjs-extension-resolution-loader/test/basic-fixtures/file.js:
```js
export const valueInFile = 'hello from file.js';
```
- valueInFolderIndex is imported from './folder' without specifying the index file name. Without a custom loader, Node's ESM specifier resolution does not have the ability to import directories that include an index file, such as ./folder/index.js or ./folder/index.mjs.
Here is loaders-test/commonjs-extension-resolution-loader/test/basic-fixtures/folder/index.js:
```js
export const valueInFolderIndex = 'hello from folder/index.js';
```
We have mentioned in another article that there are two ways to execute ESM code:
- Set "type": "module" in the package.json.
- Change index.js to index.mjs, and run node index.mjs.
Regardless, the following two commands will fail.
```
% node test/basic-fixtures/index
% node test/basic-fixtures/index.js
```
However, all these issues can be resolved by the custom loader, loaders-test/commonjs-extension-resolution-loader/loader.js:
```js
import { isBuiltin } from 'node:module';
import { dirname } from 'node:path';
import { cwd } from 'node:process';
import { fileURLToPath, pathToFileURL } from 'node:url';
import { promisify } from 'node:util';
import resolveCallback from 'resolve/async.js';

const resolveAsync = promisify(resolveCallback);

const baseURL = pathToFileURL(cwd() + '/').href;

export async function resolve(specifier, context, next) {
  const { parentURL = baseURL } = context;

  if (isBuiltin(specifier)) {
    return next(specifier, context);
  }

  // `resolveAsync` works with paths, not URLs
  if (specifier.startsWith('file://')) {
    specifier = fileURLToPath(specifier);
  }
  const parentPath = fileURLToPath(parentURL);

  let url;
  try {
    const resolution = await resolveAsync(specifier, {
      basedir: dirname(parentPath),
      // For whatever reason, --experimental-specifier-resolution=node doesn't search for .mjs extensions
      // but it does search for index.mjs files within directories
      extensions: ['.js', '.json', '.node', '.mjs'],
    });
    url = pathToFileURL(resolution).href;
  } catch (error) {
    if (error.code === 'MODULE_NOT_FOUND') {
      // Match Node's error code
      error.code = 'ERR_MODULE_NOT_FOUND';
    }
    throw error;
  }

  return next(url, context);
}
```
With the loader, the above failed commands work well:
```
% node --loader=./loader.js test/basic-fixtures/index
(node:56149) ExperimentalWarning: Custom ESM Loaders is an experimental feature. This feature could change at any time
(Use `node --trace-warnings ...` to show where the warning was created)
hello from file.js
hello from folder/index.js
% node --loader=./loader.js test/basic-fixtures/index.js
(node:56160) ExperimentalWarning: Custom ESM Loaders is an experimental feature. This feature could change at any time
(Use `node --trace-warnings ...` to show where the warning was created)
hello from file.js
hello from folder/index.js
```
With custom loaders, there is no need for the --experimental-specifier-resolution flag.
Dropped DTrace/SystemTap/ETW Support
For the following two reasons, Node.js has dropped the support for DTrace/SystemTap/ETW:
- There are no clear indicators anyone is using DTrace, SystemTap, or ETW.
- The complexity to maintain supporting these tools has proved not worth the effort.
V8 JavaScript engine is updated to V8 10.7
Node.js 19 has updated the V8 JavaScript engine to V8 10.7, which includes new Intl.NumberFormat functionality for language-sensitive number formatting.
Intl.NumberFormat(locales, options)
locales is an optional parameter, which is a BCP 47 language tag, or an array of such strings. Here is the BCP 47 language tag list:
Pic courtesy: medium.com
options is also an optional parameter, which is an object with some or all of these properties: compactDisplay, currency, currencyDisplay, currencySign, localeMatcher, notation, numberingSystem, signDisplay, style, unit, unitDisplay, useGrouping, roundingMode, roundingPriority, roundingIncrement, trailingZeroDisplay, minimumIntegerDigits, minimumFractionDigits, maximumFractionDigits, minimumSignificantDigits, and maximumSignificantDigits.
Among them, style chooses the formatting style, with the following supported values:
- “decimal” for plain number formatting (default).
- “currency” for currency formatting.
- “percent” for percent formatting.
- “unit” for unit formatting.
If style is set to 'currency', the currency property is required. currency takes an ISO 4217 currency code, such as those listed in the referenced table. By default, minimumFractionDigits and maximumFractionDigits are both set to 2.
Let’s take a look at this example:
```js
const number = 123456.789;

console.log(new Intl.NumberFormat('de-DE', { style: 'currency', currency: 'EUR' }).format(number));
console.log(new Intl.NumberFormat('ja-JP', { style: 'currency', currency: 'JPY' }).format(number));
console.log(new Intl.NumberFormat('ar-SA', { style: 'currency', currency: 'EGP' }).format(number));
console.log(new Intl.NumberFormat('zh-CN', { style: 'currency', currency: 'CNY' }).format(number));
```
- 'de-DE' is for German. 'EUR' is for the Euro. The printed value is 123.456,79 €, with the default of 2 fraction digits.
- 'ja-JP' is for Japanese. 'JPY' is for the Japanese Yen. The printed value is ¥123,457, as the Japanese Yen does not use a minor unit.
- 'ar-SA' is for Arabic. 'EGP' is for the Egyptian Pound. The printed value is ١٢٣٬٤٥٦٫٧٩ ج.م., with the default of 2 fraction digits.
- 'zh-CN' is for Chinese. 'CNY' is for the Chinese Yuan. The printed value is ¥123,456.79, with the default of 2 fraction digits.
Content Source:
- medium.com
How to build a RESTful Web API using .NET Core 6
.NET 6 is the latest LTS (Long Term Support) release currently and will be supported until November 12, 2024.
This API will manage movie records stored in a relational database (SQL Server) as described in the table below:
Pic courtesy: medium.com
The sections of this post will be as follows:
- What is REST?
- Creating a Web API Project
- Adding a Model
- Adding a Database Context
- Creating Database with Migrations
- Creating API Controller and Methods
You will need the following tools installed on your computer:
- Visual Studio 2022
- .NET 6.0 SDK
- Microsoft SQL Server Express
What is REST?
RESTful APIs conform to the REST architectural style.
REST, or REpresentational State Transfer, is an architectural style for providing standards between computer systems on the web, making it easier for systems to communicate with each other.
REST relies on a client-server relationship. This essentially means that the client application and the server application must be able to evolve separately without any dependency on each other.
REST is stateless. That means the communication between the client and the server always contains all the information needed to perform the request. There is no session state on the server; it is kept entirely on the client's side.
REST provides a uniform interface between components. Resources expose directory structure-like URIs.
REST is not strictly related to HTTP, but it is most commonly associated with it. There are four basic HTTP verbs we use in requests to interact with resources in a REST system:
- GET: retrieve a specific resource (by id) or a collection of resources
- POST: create a new resource
- PUT: update a specific resource (by id)
- DELETE: remove a specific resource by id
In a REST system, representations transfer JSON or XML to represent data objects and attributes.
REST has had such a large impact on the Web that it has mostly displaced SOAP-based interface design because it’s a considerably simpler style to use.
Creating a Web API Project
Open Visual Studio 2022 and select Create a new project and then select ASP.NET Core Web API:
Pic courtesy: medium.com
and give a name to your project in the following screen and then click Next.
In the next screen, select .NET 6.0 as the framework and click Create:
Pic courtesy: medium.com
At this point you have a starter project as follows:
Pic courtesy: medium.com
In the Program.cs you can see that Swagger support is added automatically to your project:
Pic courtesy: medium.com
And also Swashbuckle.AspNetCore NuGet package is added as a dependency.
Now, let's run (Ctrl+F5) the project to see the default output. When the browser opens and the Swagger UI is shown, select the GET method in the WeatherForecast part and then select Try It Out and Execute:
Pic courtesy: medium.com
Also, you can use the curl URL shown in the Swagger UI for this method and see the result of the URL in the browser:
Pic courtesy: medium.com
When you run the application, the default URL comes from the launchSettings.json:
Pic courtesy: medium.com
And the result values come from the GET method of the WeatherForecastController:
Pic courtesy: medium.com
As you see, values here are hard coded and randomness is added to generate different values.
In your Web API, you will create your own records in an SQL server database and will be able to view, update and delete them through REST API endpoints.
Adding a Model
Now, you will implement your data model class.
In Solution Explorer, right-click the project. Select Add -> New Folder and name the folder Models.
Then right-click the Models folder and select Add->Class. Name the class Movie.cs and click Add.
Next, add the following properties to the class:
Pic courtesy: medium.com
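The original post shows the class as a screenshot. A minimal sketch of what such a model typically looks like follows; the namespace and the properties other than Id are illustrative assumptions:

```csharp
using System.ComponentModel.DataAnnotations;

namespace MovieApi.Models // assumed namespace
{
    public class Movie
    {
        public int Id { get; set; }               // primary key

        [Required]
        public string? Title { get; set; }        // illustrative property

        public string? Genre { get; set; }        // illustrative property

        public DateTime ReleaseDate { get; set; } // illustrative property
    }
}
```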
The Id field is required by the database for the primary key.
Entity Framework Core
You will use your model with Entity Framework Core (EF Core) to work with a database.
EF Core is an object-relational mapping (ORM) framework that simplifies the data access code. Model classes don’t have any dependency on EF Core. They just define the properties of the data that will be stored in the database.
In this post, you will write the model classes first and EF Core will create the database. This is called Code First Approach.
Let’s add the EF Core NuGet packages to the project. Right-click on the project and select Manage NuGet Packages… and then install the following packages:
Pic courtesy: medium.com
Adding a Database Context
The database context is the main class that coordinates Entity Framework functionality for a data model. This class is created by deriving from Microsoft.EntityFrameworkCore.DbContext class.
Now, right-click the Models folder and select Add ->Class. Name the class MovieContext and click Add. Then add the following code to the class:
Pic courtesy: medium.com
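The screenshot isn't reproduced here; this is a minimal sketch of such a context class, with the namespace assumed to match the model sketch above:

```csharp
using Microsoft.EntityFrameworkCore;

namespace MovieApi.Models // assumed namespace
{
    public class MovieContext : DbContext
    {
        public MovieContext(DbContextOptions<MovieContext> options)
            : base(options)
        {
        }

        // The entity set; EF Core maps it to a Movies table
        public DbSet<Movie> Movies { get; set; } = null!;
    }
}
```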
The preceding code creates a DbSet<Movie> property for the entity set.
In Entity Framework terminology, an entity set typically corresponds to a database table and an entity corresponds to a row in the table.
The name of the connection string is passed into the context by calling a method on a DbContextOptions object. For local development, the ASP.NET Core configuration system reads the connection string from the appsettings.json file.
We need to add our connection string to the appsettings.json. You will use the local SQL server instance in your machine and you can define the connection string as follows:
Pic courtesy: medium.com
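A sketch of the relevant appsettings.json section; the connection string name, server, and database name are placeholders for your local setup:

```json
{
  "ConnectionStrings": {
    "MovieContext": "Server=.\\SQLEXPRESS;Database=MovieDB;Trusted_Connection=True;"
  }
}
```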
You can change the database name if you want.
Dependency Injection
ASP.NET Core is built with Dependency Injection (DI). Services (such as the EF Core DB context) are registered with DI during application startup. Components that require these services are provided with these services via constructor parameters.
Now, you will register your database context to the built-in IOC container. Add the following code to Program.cs:
Pic courtesy: medium.com
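A sketch of the registration in Program.cs; the connection string name matches the sketch above, and the rest of the template code (such as Swagger setup) is elided:

```csharp
using Microsoft.EntityFrameworkCore;
using MovieApi.Models; // assumed namespace from the sketches above

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();

// Register the database context with the built-in IOC container,
// reading the connection string from appsettings.json
builder.Services.AddDbContext<MovieContext>(options =>
    options.UseSqlServer(builder.Configuration.GetConnectionString("MovieContext")));

var app = builder.Build();
app.MapControllers();
app.Run();
```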
Creating Database with Migrations
Now, you will create the database using the EF Core Migrations feature.
Migrations lets us create a database that matches our data model and update the database schema when our data model changes.
First, you will add an initial Migration.
Open Tools -> NuGet Package Manager > Package Manager Console(PMC) and run the following command in the PMC:
Add-Migration Initial
The Add-Migration command generates code to create the initial database schema which is based on the model specified in the MovieContext class. The Initial argument is the migration name and any name can be used.
After running the command, a migration file is created under the Migrations folder:
Pic courtesy: medium.com
As the next step, run the following command in the PMC:
Update-Database
The Update-Database command runs the Up method in the Migrations/{time-stamp}_Initial.cs file, which creates the database.
Now, you will check the database created. Open View -> SQL Server Object Explorer.
You will see the newly created database as below:
Pic courtesy: medium.com
As you see, the Movie table and the Migrations History table are created automatically. Then a record is inserted into the migration history table to show the executed migrations on the database.
Creating API Controller and Methods
In this section, you will create the Movies API Controller and add the methods to it, and also will test those methods.
Let’s add the controller first. Right-click on the Controller folder and select Add -> Controller.. and then select API Controller – Empty as below:
Pic courtesy: medium.com
Click Add and give a name to your controller on the next screen.
Pic courtesy: medium.com
MoviesController is created as below:
Pic courtesy: medium.com
As you see, the class is decorated with the [ApiController] attribute. This attribute indicates that the controller responds to web API requests.
MoviesController class inherits from ControllerBase.
Next, we will inject the database context mentioned in the previous section through the constructor of the controller. Add the following code:
Pic courtesy: medium.com
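A sketch of the controller with the injected context, modeled on the standard ASP.NET Core Web API template:

```csharp
[Route("api/[controller]")]
[ApiController]
public class MoviesController : ControllerBase
{
    private readonly MovieContext _context;

    // The DI container supplies the registered MovieContext
    public MoviesController(MovieContext context)
    {
        _context = context;
    }
}
```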
Now, you will add CRUD (create, read, update, and delete) action methods to the controller. Let’s start with the GET methods.
GET Method
Add the following code to the MoviesController:
Pic courtesy: medium.com
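A sketch of the two GET actions, modeled on the standard scaffolded controller (they go inside MoviesController):

```csharp
// GET: api/Movies
[HttpGet]
public async Task<ActionResult<IEnumerable<Movie>>> GetMovies()
{
    return await _context.Movies.ToListAsync();
}

// GET: api/Movies/5
[HttpGet("{id}")]
public async Task<ActionResult<Movie>> GetMovie(int id)
{
    var movie = await _context.Movies.FindAsync(id);

    if (movie == null)
    {
        return NotFound();
    }

    return movie;
}
```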
GetMovies method returns all the movies and GetMovie(int id) method returns the movie having the Id given as input. They are decorated with the [HttpGet] attribute which denotes that a method responds to an HTTP GET request.
These methods implement two GET endpoints:
- GET /api/Movies
- GET /api/Movies/{id}
You can test the app by calling the two endpoints from a browser as follows:
- https://localhost:{port}/api/movies
- https://localhost:{port}/api/movies/{id}
The return type of the GetMovie methods is the ActionResult<T> type. ASP.NET Core automatically serializes the object to JSON and writes the JSON into the body of the response message. The response code for this return type is 200, assuming there are no unhandled exceptions. Unhandled exceptions are translated into 5xx errors.
Routing and URL Paths
The URL path for each method is constructed as follows:
Start with the template string in the controller's Route attribute (Route("api/[controller]")). Then replace [controller] with the name of the controller, which by convention is the controller class name minus the Controller suffix. For this sample, the controller class name is MoviesController, so the controller name is movies.
ASP.NET Core routing is case insensitive.
Testing the GetMovie Method
Now you will test these endpoints. Before that, let’s insert some movie records into your table.
Go to the SQL Server Object Explorer and right-click the Movies table and select View Data:
Pic courtesy: medium.com
Then add some movie records manually to the table:
Pic courtesy: medium.com
You do not need to add data for the Id column as SQL Server automatically handles this for us.
Now, you can test the GET endpoints. Start (Ctrl+F5) the application:
Pic courtesy: medium.com
Select the first GET method and click Try it out -> Execute:
Pic courtesy: medium.com
This shows all of the movies in the application.
Next, click the second GET method and click Try it out and enter one of the Ids above in the id field and click Execute:
Pic courtesy: medium.com
If no item matches the requested Id, the method returns a 404 NotFound error code.
Pic courtesy: medium.com
POST Method
Add the following code to the MoviesController:
Pic courtesy: medium.com
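A sketch of the POST action, modeled on the standard scaffolded controller:

```csharp
// POST: api/Movies
[HttpPost]
public async Task<ActionResult<Movie>> PostMovie(Movie movie)
{
    _context.Movies.Add(movie);
    await _context.SaveChangesAsync();

    // 201 Created with a Location header pointing at GetMovie
    return CreatedAtAction(nameof(GetMovie), new { id = movie.Id }, movie);
}
```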
PostMovie method creates a movie record in the database. The preceding code is an HTTP POST method, as indicated by the [HttpPost] attribute. The method gets the value of the movie record from the body of the HTTP request.
The CreatedAtAction method:
- Returns an HTTP 201 status code, if successful. HTTP 201 is the standard response for an HTTP POST method that creates a new resource on the server.
- Adds a Location header to the response. The Location header specifies the URI of the newly created movie record.
- References the GetMovie action to create the Location header's URI.
Testing the PostMovie Method
Start the application and then select the POST method in the Movies section.
Click Try it out and enter the movie information that you want to add in the request body:
Pic courtesy: medium.com
and click Execute.
Response status code is 201 (Created) and a location header is added to the response as seen below:
Pic courtesy: medium.com
You can paste this location URL in the browser and see the response there too:
Pic courtesy: medium.com
Also, you can check this record from the Movies table in your local database:
Pic courtesy: medium.com
PUT Method
Add the following code to the MoviesController:
Pic courtesy: medium.com
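A sketch of the PUT action, modeled on the standard scaffolded controller:

```csharp
// PUT: api/Movies/5
[HttpPut("{id}")]
public async Task<IActionResult> PutMovie(int id, Movie movie)
{
    // The id in the URL and in the body must match
    if (id != movie.Id)
    {
        return BadRequest();
    }

    _context.Entry(movie).State = EntityState.Modified;

    try
    {
        await _context.SaveChangesAsync();
    }
    catch (DbUpdateConcurrencyException)
    {
        if (!_context.Movies.Any(e => e.Id == id))
        {
            return NotFound();
        }
        throw;
    }

    return NoContent();
}
```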
PutMovie method updates the movie record with the given Id in the database. The preceding code is an HTTP PUT method, as indicated by the [HttpPut] attribute. The method gets the value of the movie record from the body of the HTTP request. You need to supply the Id both in the request URL and the body and they have to match. According to the HTTP specification, a PUT request requires the client to send the entire updated entity, not just the changes.
The response is 204 (No Content) if the operation is successful.
Testing the PutMovie Method
Start the application and then select the PUT method in the Movies section.
Click Try it out and enter the movie information that you want to update in the request body and the Id of the movie in the id field:
Pic courtesy: medium.com
and then click Execute.
Pic courtesy: medium.com
We can check the updated state of the movie from GET method with Id in the Swagger UI or directly from the browser as below:
Pic courtesy: medium.com
We can see the updated info in the database as well:
Pic courtesy: medium.com
If you try to update a record that does not exist in the database you get 404 Not Found error:
Pic courtesy: medium.com
DELETE Method
Add the following code to the MoviesController:
Pic courtesy: medium.com
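A sketch of the DELETE action, modeled on the standard scaffolded controller:

```csharp
// DELETE: api/Movies/5
[HttpDelete("{id}")]
public async Task<IActionResult> DeleteMovie(int id)
{
    var movie = await _context.Movies.FindAsync(id);
    if (movie == null)
    {
        return NotFound();
    }

    _context.Movies.Remove(movie);
    await _context.SaveChangesAsync();

    return NoContent();
}
```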
DeleteMovie method deletes the movie record with the given Id in the database. The preceding code is an HTTP DELETE method, as indicated by the [HttpDelete] attribute. This method expects Id in the URL to identify the movie record we want to delete.
Testing the DeleteMovie Method
Start the application and then select the DELETE method in the Movies section.
Click Try it out and enter the Id of the movie you want to delete in the id field:
Pic courtesy: medium.com
and then click Execute.
Pic courtesy: medium.com
We do not need to supply a request body as you might have noticed. The response status is 204 No Content.
If you try to get this movie record using the browser you get 404 Not Found error as expected:
Pic courtesy: medium.com
You can check as well from the database that the record is deleted:
Pic courtesy: medium.com
Content Source:
- medium.com
Cron jobs in Node.js
Cron jobs in Node.js are used for scheduling scripts to run. They're most commonly used for automating tasks like scraping data from another website, deleting data from a database, and so on. There are many situations when a web application may need specific tasks to run periodically.
In this article, the points to cover are:
- Best module for creating a cron job in Node.js
- Review cron job patterns
- Essential points for a Node.js project with a cron job
Best modules for creating a cron job in Node.js
There are three popular modules, sorted here by GitHub stars:
- node-schedule
- cron
- node-cron
However, cron currently has more downloads than the other modules.
All three modules do a similar thing, but the cron module is preferred because more examples are available in its repository.
Review cron job patterns
There are multiple tutorials for cron job patterns
Note that the cron job modules have 6 fields, while the crontab website has 5, because the first star (the seconds field) is optional.
The cron format consists of:

```
* * * * * *
┬ ┬ ┬ ┬ ┬ ┬
│ │ │ │ │ └ day of week (0 - 7) (0 or 7 is Sun)
│ │ │ │ └─── month (1 - 12)
│ │ │ └───── day of month (1 - 31)
│ │ └─────── hour (0 - 23)
│ └───────── minute (0 - 59)
└─────────── second (0 - 59, OPTIONAL)
```
Some example patterns (in standard 5-field crontab notation):

1- At 04:05: 5 4 * * *

2- At every 10th minute: */10 * * * *

3- At every odd minute: 1-59/2 * * * *

4- At every minute from 10 through 20 on Sunday and Monday: 10-20 * * * 0,1
Essential points for a Node.js project with a cron job
You know that Node.js is single-threaded, so in production you use clustering to run a Node.js project on multiple CPUs; to do this you can use PM2.
If you use a cron job in the Nest.js framework or Express.js together with clustering, the cron job starts on multiple CPUs. Let's review an example that shows the problem.
Cronjob In Cluster Mode
>>> Problem
npm install cron

Pic courtesy: medium.com
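A sketch of what such an index.js might contain; the schedule and handler are illustrative, not from the original screenshot:

```js
// index.js
const { CronJob } = require('cron');

// 6-field pattern, seconds first: fires every 10 seconds
const job = new CronJob('*/10 * * * * *', () => {
  console.log('cron tick', new Date().toISOString());
});

job.start();
```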
Run the project in cluster mode with PM2; cluster mode allows networked Node.js applications (HTTP(S)/TCP/UDP servers) to be scaled across all available CPUs.
- Start cron job example
pm2 start index.js -i max
- Show logs in pm2
pm2 logs

Pic courtesy: medium.com
As you can see, after 20 seconds the cron job has executed the desired function 8 times instead of 2, because the project is clustered on 4 CPUs.

Pic courtesy: medium.com
>>> Solution
To solve this problem, you should use the NODE_APP_INSTANCE environment variable to run the cron job on only one process, as in the sketch below.

Pic courtesy: medium.com
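A sketch of the guard; PM2 sets NODE_APP_INSTANCE to 0, 1, 2, ... for each clustered process:

```js
const { CronJob } = require('cron');

// Only the first PM2 instance starts the cron job,
// so it runs once even when the app is clustered
if (process.env.NODE_APP_INSTANCE === '0') {
  new CronJob('*/10 * * * * *', () => {
    console.log('running on a single instance only');
  }).start();
}
```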
The result

Pic courtesy: medium.com
As you see, the cron job is running on just one process.
If you want to run in different environments, with and without cluster mode, you can handle this with the NODE_ENV variable:
NODE_ENV=local node index

Pic courtesy: medium.com
With these conditions, the cron job runs in the local environment and runs in the prod environment with cluster mode.
Cronjob In Different Timezone
When you start a cron job on a server, it runs based on the server's time and timezone. The best approach is to use the timezone 'Etc/UTC' instead of the server's local timezone.
>>> Problem
Imagine there is a website that publishes new data every day at 8:00 AM. Your script is running on a server; the problem is that some timezones change their clocks (daylight saving time), so when the server's time changes, the execution time of your script is also affected.
>>> Solution
Check the local time against the UTC time:

Pic courtesy: medium.com
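Alternatively, a sketch of pinning a job to UTC with the cron package's timezone option (the schedule is illustrative):

```js
const { CronJob } = require('cron');

const job = new CronJob(
  '0 0 8 * * *',                  // every day at 08:00
  () => console.log('daily job'), // onTick
  null,                           // onComplete
  true,                           // start immediately
  'Etc/UTC'                       // timezone: not affected by DST changes
);
```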
Content Source:
- medium.com
ASP.NET Core Features in .NET 7
It is no news that Microsoft is working very hard on improving and bringing new features to the .NET framework or to its C# programming language. This time Microsoft is targeting web development and is focusing on ASP.NET Core, which will apparently come hand in hand with the .NET 7 version.
Some time ago Microsoft released Preview 1 of ASP.NET Core on .NET 7 and the amount of new features is great, so let’s see the new features!
New Minimal API Improvements
The first of the new features brings improvements to minimal APIs, especially around IFormFile and IFormCollection. With this new improvement you will be able to use IFormFile and IFormCollection to handle and manage file uploads in a much easier way.
Microsoft warns that if you want to use these new functions with authentication, anti-forgery is required, but so far Microsoft has not implemented such support. However, they reassure us that it is on the roadmap of .NET 7.
Support for such requests with client certificates or cookie headers is currently inactive. Let’s take a look at the example provided by Microsoft to see this new ASP.NET Core feature in action:
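The original example was a screenshot; here is a minimal sketch of a minimal-API endpoint binding an uploaded file to IFormFile (the route and output file name are illustrative):

```csharp
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Bind the uploaded file in the multipart request directly to IFormFile
app.MapPost("/upload", async (IFormFile file) =>
{
    await using var stream = File.OpenWrite("upload.tmp");
    await file.CopyToAsync(stream);
    return Results.Ok(new { file.FileName, file.Length });
});

app.Run();
```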
The next new enhancement for the minimal APIs in ASP.NET Core comes for Stream and PipeReader.
To understand in what kind of scenario these new minimal APIs would be used, let's imagine that we need to store data in blob storage or queue it in some queue provider like Azure. In this case we can use Stream and PipeReader to bind the body of a request and later process it in the cloud.
However, Microsoft warns us about three details to achieve correct functioning of these minimal APIs:
- The Stream will always be the same object as Body when ingesting any type of data.
- The Stream cannot be read more than once (it is not rewindable), since by default the request body is not stored in a buffer.
- As the underlying buffers will end up being reused and discarded, both Stream and PipeReader cannot be used outside of the action handler.
The last improvement in the minimal APIs that Microsoft brings in this Preview 1 is about JSON configuration. Through ConfigureRouteHandlerJsonOptions we will be able to manually configure the options and settings of the minimal API endpoints using JSON.
This improvement has been introduced mainly, as Microsoft says, to avoid confusion with Microsoft.AspNetCore.Mvc.JsonOptions.
New client source generator for SignalR
This is the next new feature that ASP.NET Core will bring in .NET 7. This new source code generator for SignalR introduced by Microsoft has the ability to generate code (both sending and receiving) strongly typed based on developer-defined interfaces.
This especially applies to SignalR hub interfaces: there is no need to call hub methods in a loosely typed way anymore, as we now have the option to reuse the interfaces in the client. At the same time, it is possible to implement an interface that contains the callback methods, and the client can take advantage of that same interface to call any method that is part of the hub.
The good thing is that Microsoft has let us see the use and operation of this new SignalR generator. Let’s see how Microsoft uses it:
- First you need to add a reference to AspNetCore.SignalR.Client.SourceGenerator.
- Then the following classes must be added to the project: HubClientProxyAttribute and HubServerProxyAttribute.
The next step is to add a static partial class and write static partial methods decorated with the HubServerProxy and HubClientProxy attributes. Finally, you use the partial methods, and that's it: that's how easy it is to use the new SignalR client source generator.
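The original code screenshots aren't reproduced here; a sketch of what that partial class can look like follows. The class and method names are illustrative, only the two attributes are the ones named above:

```csharp
using Microsoft.AspNetCore.SignalR.Client;

internal static partial class HubProxyExtensions
{
    // The generator emits a strongly typed proxy for calling server hub methods
    [HubServerProxy]
    public static partial T GetProxy<T>(this HubConnection connection);

    // The generator emits registration code for client callback methods
    [HubClientProxy]
    public static partial IDisposable RegisterCallbackProvider<T>(this HubConnection connection, T provider);
}
```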
Razor Pages and MVC views with nullable models
We continue with the next improvement brought by Microsoft. This time they have focused on improving the developer experience around null-state checking, implementing support for nullable view models in ASP.NET Core applications. Microsoft provides a small example of declaring a nullable model in a view.
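The screenshot isn't reproduced here; a minimal sketch of a nullable view model declaration in a Razor view, with an illustrative model type:

```cshtml
@model Product?

@if (Model is not null)
{
    <h1>@Model.Name</h1>
}
```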
Unfortunately there are no further examples or use cases for this new feature. Hopefully Microsoft will continue to release new features and talk more about the ones already revealed.
Validate errors with JSON property names
Thanks to this new ASP.NET feature, you will be able to manually configure validation to use JSON property names via SystemTextJsonValidationMetadataProvider.
Previously as the names of the properties that a model has, commonly were implementation details, managing them from a single page application was difficult.
If you want to know more about this feature, I recommend, as always, consulting the original source: Use JSON property names in validation errors
Injecting services in Blazor
Injecting services into Blazor validation attributes? Yes: thanks to this improvement you will be able to inject almost any type of service into custom validation attributes. To make this possible, Blazor configures the ValidationContext so it can be used as a service provider.
For more information and to develop a website using ASP.NET, Hire .NET Developer from us as we give you a high-quality product by utilizing all the latest tools and advanced technology. E-mail us anytime at – hello@hkinfosoft.com or Skype us: “hkinfosoft”.
To develop a website using ASP.NET, please visit our technology page.
Content Source:
- medium.com
What’s new in Node.js 18
Node.js 18.0 has been released by the Node.js community. The most wonderful news is that in October 2022 this version will be promoted to long-term support (LTS), at which point the release will carry the codename ‘Hydrogen’. Support for Node.js 18 will last until April 2025.
Okay now, let’s see what’s new.
Experimental Fetch:
The most exciting news is that version 18 will finally provide native fetch functionality in Node.js. For the longest time, Node did not contain support for fetch, which is a highly standard API on the web for conducting HTTP requests or any other type of network request, and Node did not support it by default. If you wanted to make an HTTP request, you had to either use third-party tools or write the request from scratch. The implementation comes from undici and is inspired by node-fetch which was originally based upon undici-fetch. The implementation strives to be as close to spec-compliant as possible, but some aspects would require a browser environment and are thus omitted.
The API will remain experimental until further test coverage is introduced and the contributors have verified that the API implements as much of the requirements as is practicable.
Because JavaScript is utilised in so many areas, this is actually wonderful news for the entire ecosystem. It’s utilised on the web, in Node.js, and in Cloudflare workers, for example.
Cloudflare workers currently ship with their own proprietary implementation of fetch. Until now you had to install a few packages before you could use it in Node, and the web has its own version, so there has been a lot of inconsistency along the way. Node is now providing official support for this: almost any environment that runs JavaScript on servers is running either Node or Deno (which already supports fetch), so fetch will now be available by default, implemented by the core Node team itself.
Example:

Undici Library in Node.js:
If you look at this closely, you can see that Node utilised, or primarily ported, a library called Undici. What exactly is this library? It’s officially produced by the Node team, and it’s really a full-fledged HTTP/1.1 client written entirely in JavaScript.

Experimental test runner:
The node:test module facilitates the creation of JavaScript tests that report results in TAP format. To access it:
import test from 'node:test';
This module is only available under the node: scheme. (Node.js documentation)
Node.js 18 features a test runner that is still in development. It is not meant to replace full-featured alternatives such as Jest or Mocha, but it does provide a quick and straightforward way to execute a test suite without any additional dependencies.
It provides TAP output, which is extensively used, and makes the output easier to consume.
More information may be found in the community blog post and the Node.js API docs.
Example:

Note: The test runner module is only available using the node: prefix. The node: prefix denotes the loading of a core module. Omitting the prefix and importing ‘test’ would attempt to load a userland module. (Node.js documentation)
Platform support:
As with other major releases, this one upgrades the minimum supported levels for systems and tooling needed to create Node.js. Node.js includes pre-built binaries for a variety of platforms. The minimum toolchains for each major release are evaluated and raised if needed.
- Prebuilt binaries for Linux are now built on Red Hat Enterprise Linux (RHEL) 8 and are compatible with Linux distributions based on glibc 2.28 or later, such as Debian 10, RHEL 8, and Ubuntu 20.04.
- macOS 10.15 or later is now required for prebuilt binaries.
- For AIX the minimum supported architecture has been raised from Power 7 to Power 8.
Due to issues with creating the V8 dependencies in Node.js, prebuilt binaries for 32-bit Windows will not be accessible at first. With a future V8 upgrade, we hope to restore 32-bit Windows binaries for Node.js 18.
According to the Node.js BUILDING.md file:
The list of supported platforms is current as of the branch/release to which it belongs.
Input
Node.js relies on V8 and libuv. We adopt a subset of their supported platforms.
Strategy
There are three support tiers:
- Tier 1: These platforms represent the majority of Node.js users. The Node.js Build Working Group maintains infrastructure for full test coverage. Test failures on tier 1 platforms will block releases.
- Tier 2: These platforms represent smaller segments of the Node.js user base. The Node.js Build Working Group maintains infrastructure for full test coverage. Test failures on tier 2 platforms will block releases. Infrastructure issues may delay the release of binaries for these platforms.
- Experimental: May not compile or test suite may not pass. The core team does not create releases for these platforms. Test failures on experimental platforms do not block releases. Contributions to improve support for these platforms are welcome.
V8 version 10.1
The V8 engine has been updated to version 10.1 as part of Chromium 101. Compared with the version included in Node.js 17.9.0, the following new features are added:
findLast() & findLastIndex():
The findLast() and findLastIndex() methods easily and ergonomically solve the use case of searching an array from the end. They perform identically to their find() and findIndex() equivalents, except that they begin their search at the end of the Array or TypedArray.
Example:

- Improvements to the Intl.Locale API.
- The Intl.supportedValuesOf function (a quick sketch follows below).
- Improved performance of class fields and private class methods (the initialization of them is now as fast as ordinary property stores).
For more information and to develop web applications using Node JS, Hire Node Developer from us as we give you a high-quality product by utilizing all the latest tools and advanced technology. E-mail us anytime at – hello@hkinfosoft.com or Skype us: “hkinfosoft”.
To develop your custom web app using Node JS, please visit our technology page.
Content Source:
- medium.com
Laravel 9: a deep dive into the latest major release
This transition to a yearly major release cycle is intended to ease the maintenance burden on the community and challenge our development team to ship amazing, powerful new features without introducing breaking changes. As a result, we have shipped a variety of robust features to Laravel 8 without breaking backwards compatibility, such as parallel testing support, improved Breeze starter kits, HTTP client improvements, and even new Eloquent relationship types such as “has one of many”.
This commitment to ship great new features during the current release will likely lead to future “major” releases being primarily used for “maintenance” tasks such as upgrading upstream dependencies, as can be seen in these release notes.
Laravel 9 continues the improvements made in Laravel 8.x by introducing support for Symfony 6.0 components, Symfony Mailer, Flysystem 3.0, improved route:list output, a Laravel Scout database driver, new Eloquent accessor / mutator syntax, implicit route bindings via Enums, and a variety of other bug fixes and usability improvements.
PHP 8.0
Laravel 9.x requires a minimum PHP version of 8.0.
Flysystem 3.x
Laravel 9.x upgrades our upstream Flysystem dependency to Flysystem 3.x. Flysystem powers all of the filesystem interactions offered by the Storage facade.
Please review the upgrade guide to learn more about ensuring your application is compatible with Flysystem 3.x.
Improved Eloquent Accessors / Mutators
Laravel 9.x offers a new way to define Eloquent accessors and mutators. In previous releases of Laravel, the only way to define accessors and mutators was by defining prefixed methods on your model like so:
public function getNameAttribute($value)
{
    return strtoupper($value);
}

public function setNameAttribute($value)
{
    $this->attributes['name'] = $value;
}
However, in Laravel 9.x you may define an accessor and mutator using a single, non-prefixed method by type-hinting a return type of Illuminate\Database\Eloquent\Casts\Attribute:
use Illuminate\Database\Eloquent\Casts\Attribute;

public function name(): Attribute
{
    return new Attribute(
        get: fn ($value) => strtoupper($value),
        set: fn ($value) => $value,
    );
}
In addition, this new approach to defining accessors will cache object values that are returned by the attribute, just like custom cast classes:
use App\Support\Address;
use Illuminate\Database\Eloquent\Casts\Attribute;

public function address(): Attribute
{
    return new Attribute(
        get: fn ($value, $attributes) => new Address(
            $attributes['address_line_one'],
            $attributes['address_line_two'],
        ),
        set: fn (Address $value) => [
            'address_line_one' => $value->lineOne,
            'address_line_two' => $value->lineTwo,
        ],
    );
}
Enum Eloquent Attribute Casting
Enum casting is only available for PHP 8.1+.
Eloquent now allows you to cast your attribute values to PHP “backed” enums. To accomplish this, you may specify the attribute and enum you wish to cast in your model’s $casts property array:
use App\Enums\ServerStatus;

/**
 * The attributes that should be cast.
 *
 * @var array
 */
protected $casts = [
    'status' => ServerStatus::class,
];
Once you have defined the cast on your model, the specified attribute will be automatically cast to and from an enum when you interact with the attribute:
if ($server->status == ServerStatus::provisioned) {
    $server->status = ServerStatus::ready;

    $server->save();
}
Implicit Route Bindings With Enums
PHP 8.1 introduces support for Enums. Laravel 9.x introduces the ability to type-hint an Enum on your route definition and Laravel will only invoke the route if that route segment is a valid Enum value in the URI. Otherwise, an HTTP 404 response will be returned automatically. For example, given the following Enum:
enum Category: string
{
    case Fruits = 'fruits';
    case People = 'people';
}
You may define a route that will only be invoked if the {category} route segment is fruits or people. Otherwise, an HTTP 404 response will be returned:
Route::get('/categories/{category}', function (Category $category) {
    return $category->value;
});
Forced Scoping Of Route Bindings
In previous releases of Laravel, you may wish to scope the second Eloquent model in a route definition such that it must be a child of the previous Eloquent model. For example, consider this route definition that retrieves a blog post by slug for a specific user:
use App\Models\Post;
use App\Models\User;

Route::get('/users/{user}/posts/{post:slug}', function (User $user, Post $post) {
    return $post;
});
When using a custom keyed implicit binding as a nested route parameter, Laravel will automatically scope the query to retrieve the nested model by its parent using conventions to guess the relationship name on the parent. However, this behavior was only previously supported by Laravel when a custom key was used for the child route binding.
However, in Laravel 9.x, you may now instruct Laravel to scope “child” bindings even when a custom key is not provided. To do so, you may invoke the scopeBindings method when defining your route:
use App\Models\Post;
use App\Models\User;

Route::get('/users/{user}/posts/{post}', function (User $user, Post $post) {
    return $post;
})->scopeBindings();
Or, you may instruct an entire group of route definitions to use scoped bindings:
Route::scopeBindings()->group(function () {
    Route::get('/users/{user}/posts/{post}', function (User $user, Post $post) {
        return $post;
    });
});
Controller Route Groups
You may now use the controller method to define the common controller for all of the routes within the group. Then, when defining the routes, you only need to provide the controller method that they invoke:
use App\Http\Controllers\OrderController;

Route::controller(OrderController::class)->group(function () {
    Route::get('/orders/{id}', 'show');
    Route::post('/orders', 'store');
});
Full Text Indexes / Where Clauses
When using MySQL or PostgreSQL, the fullText method may now be added to column definitions to generate full text indexes:
$table->text('bio')->fullText();
In addition, the whereFullText and orWhereFullText methods may be used to add full text “where” clauses to a query for columns that have full text indexes. These methods will be transformed into the appropriate SQL for the underlying database system by Laravel. For example, a MATCH AGAINST clause will be generated for applications utilizing MySQL:
$users = DB::table('users')
    ->whereFullText('bio', 'web developer')
    ->get();
Laravel Scout Database Engine
If your application interacts with small to medium sized databases or has a light workload, you may now use Scout’s “database” engine instead of a dedicated search service such as Algolia or MeiliSearch. The database engine will use “where like” clauses and full text indexes when filtering results from your existing database to determine the applicable search results for your query.
Rendering Inline Blade Templates
Sometimes you may need to transform a raw Blade template string into valid HTML. You may accomplish this using the render method provided by the Blade facade. The render method accepts the Blade template string and an optional array of data to provide to the template:
use Illuminate\Support\Facades\Blade;

return Blade::render('Hello, {{ $name }}', ['name' => 'Julian Bashir']);
Similarly, the renderComponent method may be used to render a given class component by passing the component instance to the method:
use App\View\Components\HelloComponent;

return Blade::renderComponent(new HelloComponent('Julian Bashir'));
Slot Name Shortcut
In previous releases of Laravel, slot names were provided using a name attribute on the x-slot tag:
<x-alert>
    <x-slot name="title">
        Server Error
    </x-slot>

    <strong>Whoops!</strong> Something went wrong!
</x-alert>
However, beginning in Laravel 9.x, you may specify the slot’s name using a convenient, shorter syntax:
<x-slot:title>
    Server Error
</x-slot>
Checked / Selected Blade Directives
For convenience, you may now use the @checked directive to easily indicate if a given HTML checkbox input is “checked”. This directive will echo checked if the provided condition evaluates to true:
<input type="checkbox" name="active" value="active" @checked(old('active', $user->active)) />
Likewise, the @selected directive may be used to indicate if a given select option should be “selected”:
<select name="version">
    @foreach ($product->versions as $version)
        <option value="{{ $version }}" @selected(old('version') == $version)>
            {{ $version }}
        </option>
    @endforeach
</select>
Bootstrap 5 Pagination Views
Laravel now includes pagination views built using Bootstrap 5. To use these views instead of the default Tailwind views, you may call the paginator’s useBootstrapFive method within the boot method of your App\Providers\AppServiceProvider class:
use Illuminate\Pagination\Paginator;

/**
 * Bootstrap any application services.
 *
 * @return void
 */
public function boot()
{
    Paginator::useBootstrapFive();
}
For more information and to develop web applications using Laravel, Hire Laravel Developer from us as we give you a high-quality product by utilizing all the latest tools and advanced technology. E-mail us anytime at – hello@hkinfosoft.com or Skype us: “hkinfosoft”.
To develop custom web apps using Laravel, please visit our technology page.
Content Source:
- laravel.com
How to build a plugin system with Node.js
If you’re building a minimalist product, but want to add an extra level of customisation or expansion, then plugins are a great way to enable that. They allow each instance to be customised with additional code that anyone can create.
In this tutorial, we will extend a minimal web server with a plugin system, create a plugin, and use that plugin with the server. We’ve already created a very basic web server. We’ll use this as our starter code.
What Is A Plugin
But before we start, what is a plugin? A plugin is an extra piece of code that expands what a program can normally do. Web extensions are a good example of this. They leverage existing APIs, and build new functionality upon them. In certain circumstances, they can also be used to patch bugs and flaws that the main software provider has not yet addressed.
In this tutorial, we’ll be building a request counting plugin, and the system to load and unload that plugin.
Getting Started
Let’s start by creating a fresh Node.js project. I’ll be using NPM in this tutorial, but I’ll add the Yarn commands too.
mkdir my-plugin-app
cd my-plugin-app
npm init -y (yarn init -y)
npm i express (yarn add express)
We won’t go through the process of creating an app for our plugin system. Instead, we’ll use this starter code: a simple server with one endpoint. Create an index.js file and add the code below.
const express = require('express');
const EventEmitter = require('events');

class App extends EventEmitter {
  constructor() {
    super();
    this.server = express();
    this.server.use(express.json());
  }

  start() {
    this.server.get('/', (req, res) => {
      res.send('Hello World!');
    });

    this.server.listen(8080, () => {
      console.log('Server started on port 8080');
      this.emit('start');
    });
  }

  stop() {
    if (this.stopped) return;
    console.log('Server stopped');
    this.emit('stop');
    this.stopped = true;
    process.exit();
  }
}

const app = new App();
app.start();

["exit", "SIGINT", "SIGUSR1", "SIGUSR2", "SIGTERM", "uncaughtException"].forEach(event => {
  process.on(event, () => app.stop());
});
Our First Plugin
To get a sense as to what our plugin system will do, let’s create a little plugin containing everything a plugin will need. The type of plugin system we’ll be implementing has two events. load and unload are the two times we will directly call any code in the plugin. load sets up any extra routes, middleware or anything else that is part of the plugin, and unload tells us to safely stop whatever we’re doing and save any persistent data.
This plugin will set up a middleware to count the number of requests we get. In addition, it will add an API route so we can query the number of requests that have been made so far. The plugin will export two different functions, one for each event.
const fs = require('fs');

let count = 0;

function load(app) {
  // Restore the saved count, if any
  try {
    count = +fs.readFileSync('./counter.txt');
  } catch (e) {
    console.log('counter.txt not found. Starting from 0');
  }

  // Count every incoming request
  app.server.use((req, res, next) => {
    count++;
    next();
  });

  // Expose the current count
  app.server.get('/count', (req, res) => {
    res.send({ count });
  });
}

// Save request count for next time
function unload(app) {
  // write as a string; writeFileSync does not accept numbers
  fs.writeFileSync('./counter.txt', String(count));
}

module.exports = { load, unload };
Loading Plugins
Our plugin system will be kept in a separate class from the main app; we’ll put this in a new file, plugins.js. The point of this class is to load and unload plugins.
The load method takes a plugin name and the path to its file, and uses require to load the module at runtime. The loadFromConfig method loads all of the enabled plugins defined in a config file.
const fs = require("fs"); class Plugins { constructor(app) { super(); this.app = app; this.plugins = {}; } async loadFromConfig(path='./plugins.json') { const plugins = JSON.parse(fs.readFileSync(path)).plugins; for (let plugin in plugins) { if (plugins[plugin].enabled) { this.load(plugin); } } } async load(plugin) { const path = plugins[plugin]; try { const module = require(path); this.plugins[plugin] = module; await this.plugins[plugin].load(this.app); console.log(`Loaded plugin: '${plugin}'`); } catch (e) { console.log(`Failed to load '${plugin}'`) this.app.stop(); } } } module.exports = Plugins;
We’ll use a plugins.json file to store the paths to all the plugins we wish to load, then call the loadFromConfig method to load them all at once. Put the plugins.json file in the same directory as your code.
{ "counter": "./counter.js" }
Finally, we’ll create an instance of the plugin system in our app. Import the Plugins class, create an instance in the constructor, and call loadFromConfig leaving the path blank (the default is ./plugins.json).
const express = require('express');
const EventEmitter = require('events');
const Plugins = require('./plugins');

class App extends EventEmitter {
  constructor() {
    super();
    this.plugins = new Plugins(this);
    this.server = express();
    this.server.use(express.json());
  }

  async start() {
    await this.plugins.loadFromConfig();

    this.server.get('/', (req, res) => {
      res.send('Hello World!');
    });

    this.server.listen(8080, () => {
      console.log('Server started on port 8080');
    });
  }

  stop() {
    if (this.stopped) return;
    console.log('Server stopped');
    this.stopped = true;
    process.exit();
  }
}

const app = new App();
app.start();

["exit", "SIGINT", "SIGUSR1", "SIGUSR2", "SIGTERM", "uncaughtException"].forEach(event => {
  process.on(event, () => app.stop());
});
Unloading Plugins
We now need to handle the unload method exported from our plugin. And once we do, we need to remove it from the plugins collection. We’ll also include a stop method which will unload all plugins. We’ll use this method later to enable safe shutdowns.
const fs = require("fs"); class Plugins { constructor(app) { super(); this.app = app; this.plugins = {}; } async loadFromConfig(path='./plugins.json') { const plugins = JSON.parse(fs.readFileSync(path)).plugins; for (let plugin in plugins) { if (plugins[plugin].enabled) { this.load(plugin); } } } async load(plugin) { const path = plugins[plugin]; try { const module = require(path); this.plugins[plugin] = module; await this.plugins[plugin].load(this.app); console.log(`Loaded plugin: '${plugin}'`); } catch (e) { console.log(`Failed to load '${plugin}'`) this.app.stop(); } } unload(plugin) { if (this.plugins[plugin]) { this.plugins[plugin].unload(); delete this.plugins[plugin]; console.log(`Unloaded plugin: '${plugin}'`); } } stop() { for (let plugin in this.plugins) { this.unload(plugin); } } } module.exports = Plugins;
Safe Shutdown
To make sure that plugins get a chance to unload when the app closes, we need to call Plugins.stop. In the index.js code, we included a stop method that gets called when the app is killed, and we’ve just added a stop method to the Plugins class. So let’s call the Plugins.stop method when our app stop method is called.
Add the following to the App.stop method.
  stop() {
    if (this.stopped) return;
+   this.plugins.stop();
    console.log('Server stopped');
    this.stopped = true;
    process.exit();
  }
For more information and to develop web applications using Node JS, Hire Node Developer from us as we give you a high-quality product by utilizing all the latest tools and advanced technology. E-mail us anytime at – hello@hkinfosoft.com or Skype us: “hkinfosoft”.
To develop your custom web app using Node JS, please visit our technology page.
Content Source:
- javascript.plainenglish.io