What should we learn next? If you are a developer, this question should always be on your mind. Every day, new technologies are introduced and improvements are made to existing ones. Since we can’t learn all of them, it is really important to decide what we should learn next.
In this article, we’re going to discuss three back-end frameworks based on three different programming languages to give you an insight into what you should learn in 2021.
Node.js is a cross-platform JavaScript runtime environment. Since JavaScript is one of the most popular languages in the current context, that popularity has lifted Node.js to become one of the most used back-end technologies as well. Apart from that, Node.js brings many important features that attract developers.
Node.js is used by famous companies all around the world, including eBay, General Electric, GoDaddy, Microsoft, PayPal, Uber, and Wikipins. Node.js is a perfect match if you’re building I/O-bound applications, data streaming applications, Data-Intensive Real-time applications (DIRT), JSON API-based applications, or single-page applications.
Django is an open-source, high-level web application framework written in Python. Django was introduced in 2005, and its idea of using Python for web development was a huge revolution. Django follows the model-template-view architecture, and the main focus of this framework is to provide an easy method for the development of complex websites. Instagram, Mozilla, and Bitbucket are some leading companies that use Django as their framework.
PHP is another famous language among web developers, and Laravel is based on PHP. Laravel follows the model-view-controller architecture and is robust and easy to understand. Laravel is known as a good starting point for young developers. It provides a large set of features, like flexible routing for easy scaling, configuration management to handle different environments, query builders and an ORM to query databases, a Schema Builder to maintain database definitions and schemas, lightweight templates, etc. 9GAG, MasterCard, and Kmong are some famous companies that use Laravel in their products.
As you can see, all three frameworks are very popular among developers, who tend to select a framework based on their preferred language most of the time. For example, if you’re good with JavaScript, you will likely go with Node.js. But there are other aspects we should take into account when selecting a framework.
If you’re a novice developer who doesn’t have knowledge of JavaScript, Python, or PHP, Django will be a good option for you to start with. Since Python is very straightforward and simple in its syntax, you can easily understand it. So we rank Django at the top when it comes to the learning curve, with Laravel and Node.js coming next.
Security is another aspect we need to address in any project, and all these frameworks provide built-in features to make a developer’s life easier. Out of the three, Django claims first place here as well.
If we talk about scalability and performance, Django can be recognized as the best framework in terms of scalability, while Node.js provides the best performance.
All these frameworks have large communities and good documentation to get started with and they’re well established. So don’t hesitate to select them for your projects.
For more information, or to develop a web application using Node.js, hire a back-end developer from us; we deliver a high-quality product by utilizing the latest tools and advanced technology. E-mail us anytime at hello@hkinfosoft.com or Skype us: “hkinfosoft”. To develop custom web apps using Node.js, please visit our technology page.
PHP 8 is here! It’s a new major version, which means it introduces performance improvements and lots of new features such as the JIT compiler, union types, attributes, and more.
Let’s start with all new features, it’s quite a list!
Given the dynamically typed nature of PHP, there are lots of cases where union types can be useful. Union types are a collection of two or more types which indicate that either one of those can be used.
public function foo(Foo|Bar $input): int|float;
Note that void can never be part of a union type, since it indicates “no return value at all”. Furthermore, nullable unions can be written using |null, or by using the existing ? notation:
public function foo(Foo|null $foo): void;

public function bar(?Bar $bar): void;
The JIT — just in time — compiler promises significant performance improvements, albeit not always within the context of web requests. I’ve done my own benchmarks on real-life web applications, and it seems like the JIT doesn’t make that much of a difference, if any, on those kinds of PHP projects.
If you’re familiar with the null coalescing operator you’re already familiar with its shortcomings: it doesn’t work on method calls. Instead you need intermediate checks, or rely on optional helpers provided by some frameworks:
$startDate = $booking->getStartDate();

$dateAsString = $startDate ? $startDate->asDateTimeString() : null;
With the addition of the nullsafe operator, we can now have null coalescing-like behaviour on methods!
$dateAsString = $booking->getStartDate()?->asDateTimeString();
Named arguments allow you to pass in values to a function, by specifying the value name, so that you don’t have to take their order into consideration, and you can also skip optional parameters!
function foo(string $a, string $b, ?string $c = null, ?string $d = null) { /* … */ }

foo(
    b: 'value b',
    a: 'value a',
    d: 'value d',
);
Attributes, commonly known as annotations in other languages, offer a way to add metadata to classes without having to parse docblocks.
As for a quick look, here’s an example of what attributes look like, from the RFC:
use App\Attributes\ExampleAttribute;

#[ExampleAttribute]
class Foo
{
    #[ExampleAttribute]
    public const FOO = 'foo';

    #[ExampleAttribute]
    public $x;

    #[ExampleAttribute]
    public function foo(#[ExampleAttribute] $bar) { }
}
#[Attribute]
class ExampleAttribute
{
    public $value;

    public function __construct($value)
    {
        $this->value = $value;
    }
}
Note that this base Attribute used to be called PhpAttribute in the original RFC, but was changed with another RFC afterwards. If you want to take a deep dive in how attributes work, and how you can build your own; you can read about attributes in depth on this blog.
You could call it the big brother of the switch expression: match can return values, doesn’t require break statements, can combine conditions, uses strict type comparisons and doesn’t do any type coercion.
It looks like this:
$result = match($input) {
    0 => "hello",
    '1', '2', '3' => "world",
};
This RFC adds syntactic sugar to create value objects or data transfer objects. Instead of specifying class properties and a constructor for them, PHP can now combine them into one.
Instead of doing this:
class Money
{
    public Currency $currency;

    public int $amount;

    public function __construct(
        Currency $currency,
        int $amount,
    ) {
        $this->currency = $currency;
        $this->amount = $amount;
    }
}
You can now do this:
class Money
{
    public function __construct(
        public Currency $currency,
        public int $amount,
    ) {}
}
While it was already possible to return self, static wasn’t a valid return type until PHP 8. Given PHP’s dynamically typed nature, it’s a feature that will be useful to many developers.
class Foo
{
    public function test(): static
    {
        return new static();
    }
}
Some might call it a necessary evil: the mixed type causes many to have mixed feelings. There’s a very good argument to make for it though: a missing type can mean many things in PHP. The type might have been omitted by accident, the function might return nothing, or the value might genuinely be of any type.

Because of those reasons, it’s a good thing the mixed type was added. mixed itself is equivalent to the union array|bool|callable|int|float|object|resource|string|null.
Note that mixed can also be used as a parameter or property type, not just as a return type.
Also note that since mixed already includes null, it’s not allowed to make it nullable. The following will trigger an error:
// Fatal error: Mixed types cannot be nullable, null is already part of the mixed type.
function bar(): ?mixed {}
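To illustrate mixed as both a parameter and a return type, here’s a minimal sketch (the firstElement helper is ours, not part of PHP):

```php
<?php

// firstElement() is a hypothetical helper: mixed lets it accept
// any input and return either an element or null.
function firstElement(mixed $input): mixed
{
    if (is_array($input) && $input !== []) {
        return $input[array_key_first($input)];
    }

    return null;
}

var_dump(firstElement([10, 20, 30]));   // int(10)
var_dump(firstElement('not an array')); // NULL
```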
This RFC changes throw from being a statement to being an expression, which makes it possible to throw exceptions in many new places:
$triggerError = fn () => throw new MyError();

$foo = $bar['offset'] ?? throw new OffsetDoesNotExist('offset');
Previously, PHP used to apply the same inheritance checks on public, protected and private methods. In other words: private methods should follow the same method signature rules as protected and public methods. This doesn’t make sense, since private methods won’t be accessible by child classes.
This RFC changed that behaviour, so that these inheritance checks are not performed on private methods anymore. Furthermore, the use of final private function also didn’t make sense, so doing so will now trigger a warning:
Warning: Private methods cannot be final as they are never overridden by other classes
A small, yet useful, new feature: it’s now possible to use ::class on objects, instead of having to use get_class() on them. It works the same way as get_class().
$foo = new Foo();

var_dump($foo::class);
Whenever you wanted to catch an exception before PHP 8, you had to store it in a variable, regardless of whether you used that variable or not. With non-capturing catches, you can omit the variable, so instead of this:
try {
    // Something goes wrong
} catch (MySpecialException $exception) {
    Log::error("Something went wrong");
}
You can now do this:
try {
    // Something goes wrong
} catch (MySpecialException) {
    Log::error("Something went wrong");
}
Note that it’s required to always specify the type; you’re not allowed to have an empty catch. If you want to catch all exceptions and errors, you can use Throwable as the catching type.
Trailing commas were already allowed in argument lists when calling a function, but support was still lacking in parameter lists. That’s now allowed in PHP 8, meaning you can do the following:
public function foo(
    string $parameterA,
    int $parameterB,
    Foo $objectfoo,
) {
    // …
}
You can already create a DateTime object from a DateTimeImmutable object using DateTime::createFromImmutable($immutableDateTime), but the other way around was tricky. By adding DateTime::createFromInterface() and DateTimeImmutable::createFromInterface(), there’s now a generalised way to convert DateTime and DateTimeImmutable objects to each other.
DateTime::createFromInterface(DateTimeInterface $other);

DateTimeImmutable::createFromInterface(DateTimeInterface $other);
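As a quick sketch of the round trip these methods enable (the dates are chosen arbitrarily):

```php
<?php

$immutable = new DateTimeImmutable('2021-01-01 12:00:00');

// Convert the immutable instance to a mutable DateTime…
$mutable = DateTime::createFromInterface($immutable);

// …and back again.
$roundTrip = DateTimeImmutable::createFromInterface($mutable);

var_dump($mutable instanceof DateTime);      // bool(true)
var_dump($roundTrip->format('Y-m-d H:i:s')); // string(19) "2021-01-01 12:00:00"
```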
The Stringable interface can be used to type hint anything that implements __toString(). Whenever a class implements __toString(), it automatically implements the interface behind the scenes and there’s no need to manually implement it.
class Foo
{
    public function __toString(): string
    {
        return 'foo';
    }
}

function bar(string|Stringable $stringable) { /* … */ }

bar(new Foo());
bar('abc');
Some might say it’s long overdue, but we finally don’t have to rely on strpos() anymore to know whether a string contains another string.
Instead of doing this:
if (strpos('string with lots of words', 'words') !== false) { /* … */ }
You can now do this:
if (str_contains('string with lots of words', 'words')) { /* … */ }
Two more long-overdue functions have now been added to the core:
str_starts_with('haystack', 'hay'); // true

str_ends_with('haystack', 'stack'); // true
The new fdiv() function does something similar to the fmod() and intdiv() functions, but allows division by zero. Instead of errors, you’ll get INF, -INF or NAN, depending on the case.
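A minimal demonstration of the difference:

```php
<?php

var_dump(fdiv(1, 0));  // float(INF)
var_dump(fdiv(-1, 0)); // float(-INF)
var_dump(fdiv(0, 0));  // float(NAN)

// By contrast, intdiv() still throws on division by zero.
try {
    intdiv(1, 0);
} catch (DivisionByZeroError $e) {
    echo $e->getMessage(); // Division by zero
}
```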
get_debug_type() returns the type of a variable. Sounds like something gettype() would do? It does, but get_debug_type() returns more useful output for arrays, strings, anonymous classes and objects.
For example, calling gettype() on a class \Foo\Bar would return object. Using get_debug_type() will return the class name.
A full list of differences between get_debug_type() and gettype() can be found in the RFC.
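A small sketch of the difference (the Invoice class is a hypothetical example):

```php
<?php

class Invoice {}

// gettype() only reports the generic "object" type…
var_dump(gettype(new Invoice()));        // string(6) "object"

// …while get_debug_type() returns the class name.
var_dump(get_debug_type(new Invoice())); // string(7) "Invoice"

// Scalars get the short names used in type declarations.
var_dump(get_debug_type(1));   // string(3) "int"
var_dump(get_debug_type(1.5)); // string(5) "float"
var_dump(gettype(1.5));        // string(6) "double"
```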
Resources are special variables in PHP, referring to external resources. One example is a MySQL connection, another one a file handle.
Each one of those resources gets assigned an ID, though previously the only way to know that ID was to cast the resource to int:
$resourceId = (int) $resource;
PHP 8 adds the get_resource_id() function, making this operation more obvious and type-safe:
$resourceId = get_resource_id($resource);
Traits can specify abstract methods which must be implemented by the classes using them. There’s a caveat though: before PHP 8, the signature of these method implementations wasn’t validated. The following was valid:
trait Test
{
    abstract public function test(int $input): int;
}

class UsesTrait
{
    use Test;

    public function test($input)
    {
        return $input;
    }
}
PHP 8 will perform proper method signature validation when using a trait and implementing its abstract methods. This means you’ll need to write this instead:
class UsesTrait
{
    use Test;

    public function test(int $input): int
    {
        return $input;
    }
}
The token_get_all() function returns an array of values. This RFC adds a PhpToken class with a PhpToken::tokenize() method. This implementation works with objects instead of plain values. It consumes less memory and is easier to read.
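A short sketch of the new API:

```php
<?php

$code = '<?php echo 42;';

// PhpToken::tokenize() returns an array of PhpToken objects
// instead of the mixed arrays/strings of token_get_all().
$tokens = PhpToken::tokenize($code);

echo $tokens[0]->getTokenName(); // T_OPEN_TAG
echo $tokens[1]->getTokenName(); // T_ECHO
echo $tokens[1]->text;           // echo
echo $tokens[1]->line;           // 1
```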
From the RFC: “the Uniform Variable Syntax RFC resolved a number of inconsistencies in PHP’s variable syntax. This RFC intends to address a small handful of cases that were overlooked.”
Lots of people pitched in to add proper type annotations to all internal functions. This was a long-standing issue, and it finally became solvable with all the changes made to PHP in previous versions. It means that internal functions and methods will have complete type information in reflection.
Previously it was possible to compile PHP without the JSON extension enabled; this is not possible anymore. Since JSON is so widely used, it’s best that developers can always rely on it being there, instead of having to ensure the extension exists first.
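That guarantee means code like this minimal sketch can always rely on the JSON functions being available:

```php
<?php

// Because ext-json can no longer be disabled, the JSON functions
// are always available in PHP 8.
$payload = json_encode(['status' => 'ok', 'code' => 200]);

echo $payload; // {"status":"ok","code":200}

var_dump(json_decode($payload, true)['code']); // int(200)
```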
You know those tedious tasks you have to do at work: updating configuration files, copying and pasting files, updating Jira tickets.
The definition of a reskin at the company was using the same game mechanics, screens and positioning of elements, but changing the visual aesthetics such as color and assets. So in the context of a simple game like ‘Rock Paper Scissors’, we would create a template with basic assets like below.
But when we create a reskin of this, we would use different assets and the game would still work. If you look at games like Candy Crush or Angry Birds, you’ll find that they have many varieties of the same game. Usually Halloween, Christmas or Easter releases. From a business perspective it makes perfect sense. Now… back to our implementation. Each of our games would share the same bundled JavaScript file, and load in a JSON file that had different content and asset paths.
We had stacked daily schedules, and our first thought was, ‘a lot of this could be automated.’ Whenever we created a new game, we had to carry out the same series of manual steps.
Now to us, this felt more administrative than actual development work. We were exposed to Bash scripting in a previous role and jumped on it to create a few scripts to reduce the effort involved. One of the scripts updated the templates and created a new branch, the other script did a commit and merged the project to Staging and Production environments.
Setting up a project manually would take us three to ten minutes, and deployment maybe five to ten more. Depending on the complexity of the game, the rest could take anything from ten minutes to half a day. The scripts helped, but a lot of time was still spent on updating the content or trying to chase down missing information.
Writing code to save time was not enough. We were thinking of a better approach to our workflow so that we could utilize the scripts more. We moved the content out of Word documents and into Jira tickets, breaking it into the relevant custom fields. And instead of designers sending a link to where the assets lived on the public drive, it was more practical to set up a content delivery network (CDN) repository with Staging and Production URLs for the assets.
Things like this can take a while to enforce, but our process did improve over time. We did some research on the API of Jira, our project management tool, and made some requests against the Jira tickets we were working on. We were pulling back a lot of valuable data. So valuable that we made the decision to integrate it into our Bash scripts: reading values from Jira tickets, posting comments, and tagging stakeholders when we finished.
The Bash scripts were good, but they couldn’t be run on a Windows machine. After doing some digging, we made the decision to use JavaScript to wrap the whole process into a bespoke build tool. We called the tool Mason, and it would change everything.
If you use Git in the terminal (we assume you do), you will notice it has a very friendly command-line interface. If you misspell or mistype a command, it politely suggests what it thinks you were trying to type. A library called commander brings the same behavior to your own tools, and it was one of many libraries we used.
Consider the simplified code example below. It’s bootstrapping a Command Line Interface (CLI) application.
#! /usr/bin/env node

const mason = require('commander');
const { version } = require('./package.json');
const console = require('console');

// commands
const create = require('./commands/create');
const setup = require('./commands/setup');

mason
  .version(version);

mason
  .command('setup [env]')
  .description('run setup commands for all envs')
  .action(setup);

mason
  .command('create <ticketId>')
  .description('creates a new game')
  .action(create);

mason
  .command('*')
  .action(() => {
    mason.help();
  });

mason.parse(process.argv);

if (!mason.args.length) {
  mason.help();
}
With npm, you can register a bin entry in your package.json, and it will create a global alias.
...
"bin": {
  "mason": "src/mason.js"
},
...
We then run npm link in the root of the project:
npm link
This provides us with a command called mason. Whenever we call mason in our terminal, it runs that mason.js script. All tasks fall under the one umbrella command, and we used it to build games every day. The time we saved was… incredible.
In the hypothetical example below of what we did back then, we pass a Jira ticket number to the command as an argument. This would curl the Jira API and fetch all the information we needed to update the game. It would then proceed to build and deploy the project. We would then post a comment and tag the stakeholder and designer to let them know it was done.
$ mason create GS-234
... calling Jira API
... OK! got values!
... creating a new branch from master called 'GS-234'
... updating templates repository
... copying from template 'pick-from-three'
... injecting values into config JSON
... building project
... deploying game
... Perfect! Here is the live link http://www.fake-studio.com/game/fire-water-earth
... Posted comment 'Hey [~ben.smith], this has been released. Does the design look okay? [~jamie.lane]' on Jira.
All done with a few key strokes!
We were so happy with the whole project.
The first part is a collection of recipes, or instructional building blocks that behave as individual global commands. These can be used as you go about your day, and can be called at any time to speed up your workflow or for pure convenience.
The second part is a walk-through of creating a cross-platform build tool from the ground up. Each script that achieves a certain task will be its own command, with a main umbrella command usually the name of your project encapsulating them all.
We understand that circumstances and flows are different in every business, but you should be able to find something, even if it’s small, that can make your day a little easier at the office.
Media Library can associate files with Eloquent models. You can, for instance, associate images with a blog post model. In your Blade view, you can retrieve URLs to the associated images. It can handle multiple collections, work with multiple filesystems, create zips on the fly to download multiple files, use a customized directory structure, save bandwidth using responsive images, and much more.
Before exploring Media Library Pro, let’s first explain why we built it in the first place. Here’s how a traditional upload form might look. It uses a regular input of type file.
<form method="POST" enctype="multipart/form-data">
    <x-grid>
        @csrf

        <x-field label="name">
            <x-input id="name" name="name" placeholder="Your first name" />
        </x-field>

        <x-field label="file">
            <input type="file" name="file">

            @error('file') {{ $message }} @enderror
        </x-field>

        <x-button dusk="submit">Submit</x-button>
    </x-grid>
</form>
There are two big problems with this standard upload element.
First, the upload process only starts when the form is submitted. For small files in small forms, this might not be a problem. But imagine you’re uploading a multi-megabyte file in a form. When submitting the form, you now have to wait for the upload to complete before seeing the submission results.
The second problem is something that has bothered us for a long, long time. Imagine that the input field is part of a form of which some fields are required. You’re selecting a file, submitting the form, leaving some of the required fields empty. You get redirected back to the form where error messages are now displayed. Your previous file selection is gone, and you need to select the file again.
Media Library Pro is a paid add-on package that offers Blade, Vue, and React components to upload files to your application. It ships with two components. The first one is the attachment component. It is meant to be used on a public-facing page where you want users to upload one or multiple files.
The second one is called the collection component. This one can manage the files in an existing collection. It is meant to be used in the admin part of your app.
Both of these components are available as Vue, React and Blade components. Under the hood, the Blade components are powered by Caleb’s excellent Livewire package.
These components are straightforward to install and are documented in great detail.
Let’s take a look at both the `Attachment` and `Collection` component. In the remainder of the blog post, we’ll use the Blade version of the examples, but rest assured that everything shown can also be done with the Vue and React counterparts.
To get started with the Attachment Blade component you’ll have to use x-media-library-attachment in your view.
<form method="POST">
    @csrf

    <input id="name" name="name">

    <x-media-library-attachment name="avatar"/>

    <button type="submit">Submit</button>
</form>
Here’s how it looks after we’ve selected a file but before submitting the form.
The x-media-library-attachment has taken care of the upload. The file is now stored as a temporary upload. In case there are validation errors when submitting the form, the x-media-library-attachment will display the temporary upload when you get redirected back to the form. There’s no need for the user to upload the file again.
Here’s the form request used to validate the upload.
namespace App\Http\Requests\Blade;

use Illuminate\Foundation\Http\FormRequest;
use Spatie\MediaLibraryPro\Rules\Concerns\ValidatesMedia;

class StoreBladeAttachmentRequest extends FormRequest
{
    use ValidatesMedia;

    public function rules()
    {
        return [
            'name' => 'required',
            'media' => [
                'required',
                $this->validateSingleMedia()
                    ->maxItemSizeInKb(3000),
            ],
        ];
    }
}
By applying the ValidatesMedia trait, you get access to the validateSingleMedia method, which allows you to validate the upload. You can chain on many validation methods, which are documented here.
In your controller, you can associate the uploaded file with any model you’d like.
$formSubmission
    ->addFromMediaLibraryRequest($request->media)
    ->toMediaCollection('images');
And that is all you need to do!
The attachment component can be used to handle multiple uploads as well. In this video, you’ll see how that is done.
You can manage the entire contents of a media library collection with the x-media-library-collection component. This component is intended to be used in admin sections.
Here is an example where we will administer an images collection of a $formSubmission model.
<form method="POST">
    @csrf

    <x-field label="name">
        <x-input id="name" name="name" autocomplete="off" placeholder="Your name"
            value="{{ old('name', $formSubmission->name) }}"/>
    </x-field>

    <x-field label="Images">
        <x-media-library-collection
            name="images"
            :model="$formSubmission"
            collection="images"
            max-items="3"
            rules="mimes:png,jpeg"
        />
    </x-field>

    <x-button dusk="submit" type="submit">Submit</x-button>
</form>
Here’s how that component looks:
This component will display the contents of the entire collection. Files can be added, removed, updated, and reordered.
To validate the response of the form, a form request like this one can be used:
namespace App\Http\Requests\Blade;

use Illuminate\Foundation\Http\FormRequest;
use Spatie\MediaLibraryPro\Rules\Concerns\ValidatesMedia;

class StoreBladeCollectionRequest extends FormRequest
{
    use ValidatesMedia;

    public function rules()
    {
        return [
            'name' => 'required',
            'images' => [
                $this->validateMultipleMedia()
                    ->maxItems(3)
                    ->itemName('required'),
            ],
        ];
    }
}
Again, you need the ValidatesMedia trait. This time, the validateMultipleMedia method should be used. You can chain on the other validation methods, which are documented here.
In the controller, you can associate the media in the collection component with your model using the syncFromMediaLibraryRequest method.
Here’s the relevant code in the controller of the demo app.
$formSubmission
    ->syncFromMediaLibraryRequest($request->images)
    ->toMediaCollection('images');
When using the collection component, you probably want to add some extra fields to be displayed. We’ve made this a straightforward thing to do.
In the screenshot below, we added the Extra field field.
You can achieve this by passing a blade view to the fields-view prop of the x-media-library-collection.
<x-media-library-collection
    name="images"
    :model="$formSubmission"
    collection="images"
    max-items="3"
    rules="mimes:png,jpeg"
    fields-view="uploads.blade.partials.custom-properties"
/>
In that custom-properties view, you can put anything that should be displayed in the right half of the collection component.
Here’s the content of that custom-properties view.
@include('media-library::livewire.partials.collection.fields')

<div class="media-library-field">
    <label class="media-library-label">Extra field</label>

    <input
        dusk="media-library-extra-field"
        class="media-library-input"
        type="text"
        {{ $mediaItem->customPropertyAttributes('extra_field') }}
    />

    @error($mediaItem->customPropertyErrorName('extra_field'))
        <span class="media-library-text-error">
            {{ $message }}
        </span>
    @enderror
</div>
In the form request, you can use customProperty to validate any extra custom attributes; its second argument accepts any Laravel validation rule.
namespace App\Http\Requests\Blade;

use Illuminate\Foundation\Http\FormRequest;
use Spatie\MediaLibraryPro\Rules\Concerns\ValidatesMedia;

class StoreBladeCollectionCustomPropertyRequest extends FormRequest
{
    use ValidatesMedia;

    public function rules()
    {
        return [
            'name' => 'required',
            'images' => [
                $this->validateMultipleMedia()
                    ->maxItems(3)
                    ->itemName('required|max:30')
                    ->customProperty('extra_field', 'required|max:30'),
            ],
        ];
    }
}
In the controller where you process the form submission, you should use the withCustomProperties method to whitelist any extra attributes that you want to sync with your media.
$formSubmission
    ->syncFromMediaLibraryRequest($request->images)
    ->withCustomProperties('extra_field')
    ->toMediaCollection('images');
Customizing the look and feel
By default, both the Attachment and Collection components already look good. You’ll probably want to adapt them to match the look of your app.
Luckily, this is easy to do. The styles that ship with Media Library Pro can be used by importing or linking dist/styles.css. The styles were built with a default tailwind.config.js.
You can customize the styles by importing src/styles.css and running every @apply rule through your own tailwind.config.js:
/* app.css */

@tailwind base;
@tailwind components;
@tailwind utilities;

@import "src/styles.css";

…
To achieve that behavior where uploaded files are preserved when a form validation error occurs, we use temporary uploads.
Inside the private spatie/laravel-medialibrary-pro repo, there are a lot of tests to make sure the back end integration and the Vue, React, and Blade front end components are working as expected.
We also wanted to have browser tests that ensure that front end components work perfectly with the back end and vice versa. That’s why we added Dusk tests in our demo application. You can see them here.
Let’s take a look at one of them:
/**
 * @test
 *
 * @dataProvider routeNames
 */
public function it_can_handle_a_single_upload(string $routeName)
{
    $this->browse(function (Browser $browser) use ($routeName) {
        $browser
            ->visit(route($routeName))
            ->type('name', 'My name')
            ->attach('@main-uploader', $this->getStubPath('space.png'))
            ->waitForText('Remove')
            ->waitUntilMissing('.media-library-progress-wrap.media-library-progress-wrap-loading')
            ->press('@submit')
            ->assertSee('Your form has been submitted');

        $this->assertCount(1, FormSubmission::get());

        $this->assertEquals('space.png', FormSubmission::first()->getFirstMedia('images')->file_name);
    });
}
This test will upload a file and make sure that the file is associated with a model after the form is submitted.
A thing to note here is the @dataProvider annotation. It makes PHPUnit run the test for each result returned by the routeNames function defined in the same file.
public function routeNames(): array
{
    return [
        ['vue.attachment'],
        ['react.attachment'],
        ['blade.attachment'],
    ];
}
You can see that, in combination with the routeNames function, it_can_handle_a_single_upload will run for the vue.attachment, react.attachment and blade.attachment routes. Visiting these routes will display the form that uses the Vue, React, or Blade component, respectively. So this one test covers a lot of logic: it makes sure the components work using each technology, which gives us a lot of confidence that all of them are working correctly.
For more information, or to develop a web application using Laravel, hire a Laravel developer from us: we deliver high-quality products using the latest tools and technologies. E-mail us around the clock at hello@hkinfosoft.com or reach us on Skype at "hkinfosoft". To develop custom web apps, please visit our technology page.
Node is a leader in the asynchronous framework market. The platform now supports a huge portion of startups and businesses that earn hundreds of millions of dollars in revenue, establishing itself as a platform that can sustain a huge load whilst retaining smooth performance. Node.js was perhaps the biggest revelation of modern server engineering, and by the looks of it, Node isn't stopping any time soon; quite the opposite. The project continues to push out frequent updates and maintains old releases to support older platforms. Recent releases have closed security loopholes in OpenSSL and added better support for native modules written in C and C++.
Starting with Node.js is a fairly easy process; the guidelines are well documented and thousands of projects are sitting on GitHub, waiting for you to inspect and analyze their architecture. Node.js works great on all platforms, even Windows 10, for those who are interested. That makes it a truly great platform for learning front end and back end development together. Let's not forget that Node has the most populated package registry of any framework or language. Thanks to the modules and libraries available through the package manager (npm), building a website takes only a couple of minutes. So let's get started with the top Node.js packages.
All common programming languages share similar structures in the way things are built. One of the fastest ways to get a programming language to serve your needs is through a framework. Express is the leading Node.js framework for quickly creating and publishing applications and APIs. The framework’s minimal structure allows any Node.js developer to quickly launch a functional application with the use of Express Generator. Express gives you a solid outline to build your apps on top of. Combine it with any of the other packages we will discuss, and you will quickly realize just how amazing this framework truly is.
Node.js is known for being the framework to use for scaling large applications and infrastructure, so process management should be an essential priority for any Node.js user. PM2 offers both process management for production applications and a load balancer to help with performance tweaks. With PM2, your applications stay online indefinitely, and you get the tools to reload apps without experiencing any downtime. Is it a surprise that hundreds of thousands of Node.js users consider this an essential tool to have?
Next up in this Node.js package roundup is Mocha, a feature-rich JavaScript test framework that runs on Node.js and in the browser, making asynchronous testing simple and fun. Mocha tests run serially, allowing for flexible and accurate reporting while mapping uncaught exceptions to the correct test cases. Testing is important for understanding how well an application performs, locating leaks, and working out how to fix the bugs, problems, and irritations we run into. It also lets developers understand how their code behaves and, in turn, build their skills as they continue down their chosen path.
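Mocha's serial execution is easy to picture with a tiny runner sketch. This is a toy illustration of the idea, not Mocha's actual internals: each test is awaited before the next one starts, and thrown errors are captured per test.

```javascript
// Minimal sketch of serial async test execution, similar in spirit
// to how Mocha awaits each test before starting the next one.
async function runSerially(tests) {
  const results = [];
  for (const { name, fn } of tests) {
    try {
      await fn(); // wait for the async test to finish before moving on
      results.push({ name, ok: true });
    } catch (err) {
      results.push({ name, ok: false, error: err.message });
    }
  }
  return results;
}

// Example suite: two passing tests and one failing one.
const tests = [
  { name: 'adds numbers', fn: async () => { if (1 + 1 !== 2) throw new Error('bad math'); } },
  { name: 'resolves later', fn: () => new Promise((resolve) => setTimeout(resolve, 10)) },
  { name: 'always fails', fn: async () => { throw new Error('boom'); } },
];

runSerially(tests).then((results) => console.log(results));
```

Because the tests run one at a time, the report order always matches the declaration order, which is what makes Mocha's output predictable.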
Lodash is a modern JavaScript utility library delivering modularity, performance, and extras. It makes JavaScript easier by taking the hassle out of working with arrays, numbers, objects, strings, and so on. Lodash's modular methods are great for iterating arrays, objects, and strings; manipulating and testing values; and creating composite functions.
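To illustrate the kind of hassle Lodash removes, here is a hand-rolled equivalent of its `_.chunk` helper. This is a simplified sketch; the real Lodash implementation handles more edge cases (non-integer sizes, sparse arrays, and so on).

```javascript
// Plain-JS equivalent of Lodash's _.chunk(array, size):
// splits an array into groups of at most `size` elements.
function chunk(array, size) {
  const result = [];
  for (let i = 0; i < array.length; i += size) {
    result.push(array.slice(i, i + size));
  }
  return result;
}

console.log(chunk(['a', 'b', 'c', 'd'], 2)); // [['a','b'],['c','d']]
console.log(chunk(['a', 'b', 'c', 'd'], 3)); // [['a','b','c'],['d']]
```

With Lodash you get this (and a few hundred similar helpers) pre-tested and consistently named, which is exactly its value proposition.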
ESLint is a static code analysis tool for identifying problematic patterns in JavaScript code. Rules in ESLint are configurable, and custom rules can be defined and loaded. ESLint covers both code-quality and coding-style issues. In short, the goal is to make code more consistent and to avoid bugs. In many ways it is similar to JSLint and JSHint, with the key difference that every rule is a plugin and nothing is enabled by default.
Passport is a popular authentication module for Node.js developers. Its main goal is to help with authentication requests, which Passport achieves through third-party plugins that act as authentication methods, otherwise known as strategies. The Passport API is straightforward: you give Passport a request that you need to authenticate, and Passport in turn gives you hooks that let you control what happens after an authentication call fails or succeeds. Exploring the strategies, there are hundreds of authentication methods to choose from, starting with internal ones and going all the way up to external providers like Google, Facebook, and others.
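The strategy idea itself can be sketched in a few lines of plain JavaScript. This is a toy registry to show the pattern, not Passport's real API; the "local" strategy and its hard-coded user are purely illustrative.

```javascript
// Toy sketch of Passport's "strategy" pattern: each strategy is just a
// named verify function; authenticate() looks one up and runs it.
const strategies = {};

function use(name, verify) {
  strategies[name] = verify;
}

async function authenticate(name, credentials) {
  const verify = strategies[name];
  if (!verify) throw new Error(`Unknown strategy: ${name}`);
  return verify(credentials); // resolves to a user object, or null on failure
}

// A "local" strategy checking a hard-coded user (illustration only).
use('local', async ({ username, password }) => {
  return username === 'alex' && password === 'secret'
    ? { id: 1, username: 'alex' }
    : null;
});
```

Swapping in a Google or Facebook strategy is then just registering another verify function under another name, which is why Passport scales to hundreds of providers.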
Moment.js is one of the most stable and well-maintained time-manipulation libraries you can find. Of the whole collection of libraries created to solve the problems of formatting, parsing, converting and, generally, working with different forms of time, Moment.js is the one that has seen the widest adoption.
import moment from "moment";

// in relation to the release date of this post
moment().format("MMMM Do YYYY");          // June 6th 2019
moment("20111031", "YYYYMMDD").fromNow(); // 8 years ago
moment().subtract(10, "days").calendar(); // 05/27/2019
With its latest v2 release, Moment.js was rewritten to support modern ES6 syntax. This brings improved modularity and better performance in evergreen browsers. Such things matter, especially when dealing with a library as big as Moment.js.
With Chalk, we enter the world of terminal-related tools and libraries, where download counts (and thus popularity) go crazy! Chalk is an extremely simple library, created for one simple purpose: styling your terminal strings. It proves that the most useful things are often also the simplest ones.
import chalk from "chalk";

// string concatenation - template literals
console.log(`${chalk.blue("Hello")} World${chalk.red("!")}`);

// chainable API
console.log(chalk.blue.bgRed.bold("Hello world!"));
Of course, the API is simple, intuitive (chainable), and works really well with everything JS offers natively. The package's official page states that it is used by more than 20K other packages, which may be where the weekly download count (~25M) comes from. Either way, such numbers cannot be ignored.
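Under the hood, all Chalk fundamentally does is wrap a string in ANSI escape sequences, something you can sketch yourself. The colour codes below are the standard ANSI ones; this is a simplification of what the library actually emits (Chalk also handles nesting, 256-colour modes, and terminal capability detection).

```javascript
// What Chalk boils down to: wrapping text in ANSI escape sequences.
const ANSI = { red: 31, blue: 34, reset: 0 };

function color(name, text) {
  return `\u001b[${ANSI[name]}m${text}\u001b[${ANSI.reset}m`;
}

console.log(`${color('blue', 'Hello')} World${color('red', '!')}`);
```

Chalk's value is that you never have to remember these magic numbers, and its chainable API composes styles cleanly.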
Socket.io is a Node.js package that allows you to build truly real-time communication applications: apps that require live streams of data, either directly from the data you are working with or through an Application Programming Interface (API) from another source. Example uses include a bot collecting the latest tweets from Twitter or a bot watching the news on Facebook; by combining such APIs you can explore all sorts of interesting things that work with data in real time.
Request was designed to be the simplest way possible to make HTTP calls. It supports HTTPS and follows redirects by default, and you can also stream a file to a PUT or POST request. Although it is now deprecated, it remains one of the most widely used packages for network calls.
Sanctum is Laravel's lightweight API authentication package. This tutorial will go over using Laravel Sanctum to authenticate a mobile app. The app will be built in Flutter, Google's cross-platform app development toolkit. We'll skip some implementation details of the mobile app, since that is not the focus of this tutorial.
We’ve set up Homestead to provision a domain name, api.sanctum-mobile.test, where my backend will be served, as well as a MySQL database.
First, create the Laravel app:
laravel new sanctum_mobile
At the time of writing, this gives us a new Laravel project (v8.6.0). As with the SPA tutorial, the API will provide a list of books, so we'll create the same resources:
php artisan make:model Book -mr
The mr flags create the migration and controller too. Before we mess with the migrations, let’s first install the Sanctum package, since we’ll need its migrations again.
composer require laravel/sanctum
php artisan vendor:publish --provider="Laravel\Sanctum\SanctumServiceProvider"
Now, create the books migration:
Schema::create('books', function (Blueprint $table) {
    $table->id();
    $table->string('title');
    $table->string('author');
    $table->timestamps();
});
Next, run your app’s migrations:
php artisan migrate
If you now take a look in the database, you’ll see the Sanctum migration has created a personal_access_tokens table, which we’ll use later when authenticating the mobile app.
Let’s update DatabaseSeeder.php to give us some books (and a user for later):
Book::truncate();

$faker = \Faker\Factory::create();

for ($i = 0; $i < 50; $i++) {
    Book::create([
        'title' => $faker->sentence,
        'author' => $faker->name,
    ]);
}

User::truncate();

User::create([
    'name' => 'Alex',
    'email' => 'alex@alex.com',
    'password' => Hash::make('pwdpwd'),
]);
Now seed the database: php artisan db:seed. Finally, create the route and the controller action. Add this to the routes/api.php file:
Route::get('book', [BookController::class, 'index']);
and then in the index method of BookController, return all the books:
return response()->json(Book::all());
After checking that the endpoint works — curl https://api.sanctum-mobile.test/api/book — it’s time to start the mobile app.
For the mobile app, we’ll be using Android Studio and Flutter. Flutter allows you to create cross-platform apps that re-use the same code for Android and iPhone devices. First, follow the instructions to install Flutter and to set up Android Studio, then launch Android Studio and click “Create a new Flutter project.”
Follow the recipe in Flutter’s cookbook to fetch data from the internet to create a page that fetches a list of books from the API. A quick and easy way to expose our API to the Android Studio device is to use Homestead’s share command:
share api.sanctum-mobile.test
The console will output an ngrok page, which will give you a URL (something like https://0c9775bd.ngrok.io) exposing your local server to the public. (An alternative to ngrok is Beyond Code’s Expose.) So let’s create a utils/constants.dart file to put that in:
const API_URL = 'http://191b43391926.ngrok.io';
Now, back to the Flutter cookbook. Create a file books.dart which will contain the classes required for our book list. First, a Book class to hold the data from the API request:
class Book {
  final int id;
  final String title;
  final String author;

  Book({this.id, this.title, this.author});

  factory Book.fromJson(Map<String, dynamic> json) {
    return Book(
      id: json['id'],
      title: json['title'],
      author: json['author'],
    );
  }
}
Second, a BookList class to fetch the books and call the builder to display them:
class BookList extends StatefulWidget {
  @override
  _BookListState createState() => _BookListState();
}

class _BookListState extends State<BookList> {
  Future<List<Book>> futureBooks;

  @override
  void initState() {
    super.initState();
    futureBooks = fetchBooks();
  }

  Future<List<Book>> fetchBooks() async {
    List<Book> books = new List<Book>();
    final response = await http.get('$API_URL/api/book');
    if (response.statusCode == 200) {
      List<dynamic> data = json.decode(response.body);
      for (int i = 0; i < data.length; i++) {
        books.add(Book.fromJson(data[i]));
      }
      return books;
    } else {
      throw Exception('Problem loading books');
    }
  }

  @override
  Widget build(BuildContext context) {
    return Column(
      children: <Widget>[
        BookListBuilder(futureBooks: futureBooks),
      ],
    );
  }
}
And last, a BookListBuilder to display the books:
class BookListBuilder extends StatelessWidget {
  const BookListBuilder({
    Key key,
    @required this.futureBooks,
  }) : super(key: key);

  final Future<List<Book>> futureBooks;

  @override
  Widget build(BuildContext context) {
    return FutureBuilder<List<Book>>(
      future: futureBooks,
      builder: (context, snapshot) {
        if (snapshot.hasData) {
          return Expanded(child: ListView.builder(
            itemCount: snapshot.data.length,
            itemBuilder: (context, index) {
              Book book = snapshot.data[index];
              return ListTile(
                title: Text('${book.title}'),
                subtitle: Text('${book.author}'),
              );
            },
          ));
        } else if (snapshot.hasError) {
          return Text("${snapshot.error}");
        }
        return CircularProgressIndicator();
      }
    );
  }
}
Now we just need to modify the MyApp class in main.dart to load the BookList:
class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Sanctum Books',
      home: new Scaffold(
        body: BookList(),
      )
    );
  }
}
Now launch this in your test device or the emulator, and you should see a list of books.
Great, so we know the API is working and that we can fetch books from it. The next step is to set up authentication.
We’re going to use the provider package, and follow the guidelines in the official documentation for setting up simple state management. We want to create an authentication provider that keeps track of the logged-in status and eventually communicates with the server. Create a new file, auth.dart. Here’s where the authentication functionality will go. For the moment, we’ll return true so we can test the process works:
class AuthProvider extends ChangeNotifier {
  bool _isAuthenticated = false;

  bool get isAuthenticated => _isAuthenticated;

  Future<bool> login(String email, String password) async {
    print('logging in with email $email and password $password');
    _isAuthenticated = true;
    notifyListeners();
    return true;
  }
}
With this provider, we can now check whether we're authenticated and display the correct page accordingly. Modify your main function to include the provider:
void main() {
  runApp(
    ChangeNotifierProvider(
      create: (BuildContext context) => AuthProvider(),
      child: MyApp(),
    )
  );
}
… and modify the MyApp class to show the BookList widget if we’re logged-in, or a LoginForm widget otherwise:
body: Center(
  child: Consumer<AuthProvider>(
    builder: (context, auth, child) {
      switch (auth.isAuthenticated) {
        case true:
          return BookList();
        default:
          return LoginForm();
      }
    },
  )
),
The LoginForm classes contain a lot of "widgety" cruft, so we'll refer you to the GitHub repo if you're interested in looking at them. Anyway, if you load the app on your test device, you should see a login form. Fill in a random email and password, submit the form, and you'll see the list of books.
Ok, let’s set up the backend to handle the authentication. The docs tell us to create a route that will accept the username and password, as well as a device name, and return a token. So let’s create a route in the api.php file:
Route::post('token', [AuthController::class, 'requestToken']);
and a controller: php artisan make:controller AuthController. This will contain the code from the docs:
public function requestToken(Request $request): string
{
    $request->validate([
        'email' => 'required|email',
        'password' => 'required',
        'device_name' => 'required',
    ]);

    $user = User::where('email', $request->email)->first();

    if (! $user || ! Hash::check($request->password, $user->password)) {
        throw ValidationException::withMessages([
            'email' => ['The provided credentials are incorrect.'],
        ]);
    }

    return $user->createToken($request->device_name)->plainTextToken;
}
Provided the username and password are valid, this will create a token, save it in the database, and return it to the client. To get this to work, we need to add the HasApiTokens trait to our User model. This gives us a tokens relationship, allowing us to create and fetch tokens for the user, and a createToken method. The token itself is a sha256 hash of a 40-character random string: this string (unhashed) is returned to the client, which should save it to use with any future requests to the API. More precisely, the string returned to the client is composed of the token's id, followed by a pipe character (|), followed by the plain text (unhashed) token.
So now that we have this endpoint in place, let's update the app to use it. The login method will now post the email, password, and device_name to this endpoint and, if it gets a 200 response, save the token in the device's storage. For device_name, we're using the device_info package to get the device's unique ID, though in fact this string is arbitrary.
final response = await http.post('$API_URL/token', body: {
  'email': email,
  'password': password,
  'device_name': await getDeviceId(),
}, headers: {
  'Accept': 'application/json',
});

if (response.statusCode == 200) {
  String token = response.body;
  await saveToken(token);
  _isAuthenticated = true;
  notifyListeners();
}
We use the shared_preferences package, which allows for the storage of simple key-value pairs, to save the token:
saveToken(String token) async {
  final prefs = await SharedPreferences.getInstance();
  await prefs.setString('token', token);
}
So now we’ve got the app displaying the books page after a successful login. But of course, as things stand, the books are accessible with or without successful login. Try it out: curl https://api.sanctum-mobile.test/api/book. So now let’s protect the route:
Route::middleware('auth:sanctum')->get('book', [BookController::class, 'index']);
Login again via the app, and this time you’ll get an error: “Problem loading books”. You are successfully authenticating, but because we don’t as yet send the API token with our request to fetch the books, the API is quite rightly not sending them. As in the previous tutorial, let’s look at the Sanctum guard to see what it’s doing here:
if ($token = $request->bearerToken()) {
    $model = Sanctum::$personalAccessTokenModel;

    $accessToken = $model::findToken($token);

    if (! $accessToken ||
        ($this->expiration &&
            $accessToken->created_at->lte(now()->subMinutes($this->expiration))) ||
        ! $this->hasValidProvider($accessToken->tokenable)) {
        return;
    }

    return $this->supportsTokens($accessToken->tokenable)
        ? $accessToken->tokenable->withAccessToken(
            tap($accessToken->forceFill(['last_used_at' => now()]))->save()
        ) : null;
}
The first condition is skipped since we aren’t using the web guard. Which leaves us with the above code. First, it only runs if the request has a “Bearer” token, i.e. if it contains an Authorization header which starts with the string “Bearer”. If it does, it will call the findToken method on the PersonalAccessToken model:
if (strpos($token, '|') === false) {
    return static::where('token', hash('sha256', $token))->first();
}

[$id, $token] = explode('|', $token, 2);

if ($instance = static::find($id)) {
    return hash_equals($instance->token, hash('sha256', $token))
        ? $instance
        : null;
}
The first conditional checks to see whether the pipe character is in the token and, if not, to return the first model that matches the token. We assume this is to preserve backward compatibility with versions of Sanctum before 2.3, which did not include the pipe character in the plain text token when returning it to the user. (Here is the pull request: the reason was to make the token lookup query more performant.) Anyway, assuming the pipe character is there, Sanctum grabs the model’s ID and the token itself, and checks to see if the hash matches with what is stored in the database. If it does, the model is returned.
Back in Guard: if no token is returned, or if we’re considering expiring tokens (which we’re not in this case), return null (in which case authentication fails). Finally:
return $this->supportsTokens($accessToken->tokenable)
    ? $accessToken->tokenable->withAccessToken(
        tap($accessToken->forceFill(['last_used_at' => now()]))->save()
    ) : null;
Check that the tokenable model (i.e., the User model) supports tokens (in other words, that it uses the HasApiTokens trait). If not, return null – authentication fails. If so, then return this:
$accessToken->tokenable->withAccessToken(
    tap($accessToken->forceFill(['last_used_at' => now()]))->save()
)
The above example uses the single-argument version of the tap helper. This can be used to force an Eloquent method (in this case, save) to return the model itself. Here the access token model’s last_used_at timestamp is updated. The saved model is then passed as an argument to the User model’s withAccessToken method (which it gets from the HasApiTokens trait). This is a compact way of updating the token’s last_used_at timestamp and returning its associated User model. Which means authentication has been successful.
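The `tap` idiom is not PHP-specific. A JavaScript equivalent makes the "run a side effect, return the value" shape obvious; this is a sketch of the callback form of Laravel's helper (the single-argument proxy form used above is a PHP-specific trick), not its actual source.

```javascript
// tap(value, fn): run fn for its side effects, then return value unchanged.
function tap(value, fn) {
  fn(value);
  return value;
}

// Mirrors the Sanctum guard: update a timestamp, but keep the
// token object itself as the result of the expression.
const accessToken = { id: 1, last_used_at: null };
const result = tap(accessToken, (t) => { t.last_used_at = Date.now(); });

console.log(result === accessToken); // true
```

The appeal is that an update and a return collapse into one expression, instead of a temporary variable plus two statements.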
So, back to the app. With this authentication in place, we need to update the app’s call to the book endpoint to pass the token in the request’s Authorization header. To do this, update the fetchBooks method to grab the token from the Auth provider, then add it to the header:
String token = await Provider.of<AuthProvider>(context, listen: false).getToken();

final response = await http.get('$API_URL/api/book', headers: {
  'Authorization': 'Bearer $token',
});
Don’t forget to add a getToken method to the AuthProvider class:
Future<String> getToken() async {
  final prefs = await SharedPreferences.getInstance();
  return prefs.getString('token');
}
Now try logging in again, and this time the books should be displayed.
Laravel 8 is now released and includes many new features including Laravel Jetstream, a models directory, model factory classes, migration squashing, rate-limiting improvements, time testing helpers, dynamic blade components, and many more features.
Before we jump into the new features, we’d like to point out that starting with version 6, Laravel now follows semver and will release a new major version every six months. You can see how the release process works here.
Laravel Jetstream improves upon the existing Laravel UI scaffolding found in previous versions. It provides a starting point for new projects, including login, registration, email verification, two-factor authentication, session management, API support via Laravel Sanctum, and team management.
Laravel 8’s application skeleton includes an app/Models directory. All generator commands assume models exist in app/Models; however if this directory doesn’t exist, the framework will assume the application keeps models within the app/ folder.
Eloquent model factories are now class-based starting in Laravel 8, with improved support for relationships between factories (i.e., a user has many posts). I think you’ll agree how awesome the new syntax is for generating records via the new and improved model factories:
use App\Models\User;

User::factory()->count(50)->create();

// using a model state "suspended" defined within the factory class
User::factory()->count(5)->suspended()->create();
If your application contains many migration files, you can now squash them into a single SQL file. This file will be executed first when running migrations, followed by any remaining migration files that are not part of the squashed schema file. Squashing existing migrations can decrease migration file bloat and possibly improve performance while running tests.
Laravel 8 brings improvements to existing rate limiting functionality while supporting backward compatibility with the existing throttle middleware and offering far more flexibility. Laravel 8 has the concept of Rate Limiters that you can define via a facade:
use Illuminate\Cache\RateLimiting\Limit;
use Illuminate\Support\Facades\RateLimiter;

RateLimiter::for('global', function (Request $request) {
    return Limit::perMinute(1000);
});
As you can see, the closure passed to the for() method receives the HTTP request instance, giving you full control over limiting requests dynamically.
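A fixed-window limiter like Limit::perMinute can be sketched in a few lines. This is an in-memory JavaScript illustration of the concept, not Laravel's implementation (which stores counters in the cache backend).

```javascript
// Minimal fixed-window rate limiter: allow `max` hits per key per window.
function createRateLimiter(max, windowMs) {
  const hits = new Map(); // key -> { count, windowStart }

  return function attempt(key, now = Date.now()) {
    const entry = hits.get(key);
    if (!entry || now - entry.windowStart >= windowMs) {
      hits.set(key, { count: 1, windowStart: now }); // first hit of a new window
      return true;
    }
    if (entry.count < max) {
      entry.count += 1;
      return true;
    }
    return false; // limit reached for this window
  };
}

// e.g. 1000 requests per minute, keyed by client IP
const attempt = createRateLimiter(1000, 60000);
console.log(attempt('10.0.0.1')); // true
```

Keying the limiter by request attributes (IP, user id, route) is what the request instance in Laravel's closure makes possible.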
Laravel users have enjoyed full control over time modification via the excellent Carbon PHP library. Laravel 8 brings this one step further by providing convenient test helpers for manipulating the time within tests:
// Travel into the future...
$this->travel(5)->milliseconds();
$this->travel(5)->seconds();
$this->travel(5)->minutes();
$this->travel(5)->hours();
$this->travel(5)->days();
$this->travel(5)->weeks();
$this->travel(5)->years();

// Travel into the past...
$this->travel(-5)->hours();

// Travel to an exact time...
$this->travelTo(now()->subHours(6));

// Return back to the present time...
$this->travelBack();
When using these methods, the time will reset between each test.
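The idea behind these helpers is simply swapping out the clock the code reads. Here is a JavaScript sketch of the mechanic using an offset on Date.now; this is an assumption about the general technique, not how Laravel's Carbon integration actually works internally.

```javascript
// Sketch of "time travel" for tests: shift what Date.now() returns.
const realNow = Date.now;
let offsetMs = 0;

function travel(ms) {
  offsetMs += ms;
  Date.now = () => realNow() + offsetMs;
}

function travelBack() {
  // restore the real clock, as Laravel does between tests
  offsetMs = 0;
  Date.now = realNow;
}

const before = Date.now();
travel(5 * 60 * 60 * 1000); // jump 5 hours into the future
const after = Date.now();
console.log(after - before >= 5 * 60 * 60 * 1000); // true
travelBack();
```

Any code that asks the patched clock for "now" sees the shifted time, which is exactly what lets tests assert on expirations without actually waiting.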
Sometimes you need to render a Blade component dynamically at runtime. Laravel 8 provides the <x-dynamic-component/> component for this:
<x-dynamic-component :component="$componentName" class="mt-4" />
Laravel’s job batching feature allows you to easily execute a batch of jobs and then perform some action when the batch of jobs has completed executing.
The new batch method of the Bus facade may be used to dispatch a batch of jobs. Of course, batching is primarily useful when combined with completion callbacks. So, you may use the then, catch, and finally methods to define completion callbacks for the batch. Each of these callbacks will receive an Illuminate\Bus\Batch instance when they are invoked:
use App\Jobs\ProcessPodcast;
use App\Podcast;
use Illuminate\Bus\Batch;
use Illuminate\Support\Facades\Bus;
use Throwable;

$batch = Bus::batch([
    new ProcessPodcast(Podcast::find(1)),
    new ProcessPodcast(Podcast::find(2)),
    new ProcessPodcast(Podcast::find(3)),
    new ProcessPodcast(Podcast::find(4)),
    new ProcessPodcast(Podcast::find(5)),
])->then(function (Batch $batch) {
    // All jobs completed successfully...
})->catch(function (Batch $batch, Throwable $e) {
    // First batch job failure detected...
})->finally(function (Batch $batch) {
    // The batch has finished executing...
})->dispatch();

return $batch->id;
In previous releases of Laravel, the php artisan down maintenance mode feature could be bypassed using an "allow list" of IP addresses that were permitted to access the application. This feature has been removed in favor of a simpler "secret" / token solution.
While in maintenance mode, you may use the secret option to specify a maintenance mode bypass token:
php artisan down --secret="1630542a-246b-4b66-afa1-dd72a4c43515"
After placing the application in maintenance mode, you may navigate to the application URL matching this token and Laravel will issue a maintenance mode bypass cookie to your browser:
https://example.com/1630542a-246b-4b66-afa1-dd72a4c43515
When accessing this hidden route, you will then be redirected to the / route of the application. Once the cookie has been issued to your browser, you will be able to browse the application normally as if it was not in maintenance mode.
If you utilize the php artisan down command during deployment, your users may still occasionally encounter errors if they access the application while your Composer dependencies or other infrastructure components are updating. This occurs because a significant part of the Laravel framework must boot in order to determine your application is in maintenance mode and render the maintenance mode view using the templating engine.
For this reason, Laravel now allows you to pre-render a maintenance mode view that will be returned at the very beginning of the request cycle. This view is rendered before any of your application’s dependencies have loaded. You may pre-render a template of your choice using the down command’s render option:
php artisan down --render="errors::503"
Nest JS and MongoDB are a superior combination for building large, scalable web application back ends. Nest JS is a progressive framework for building efficient, scalable Node.js web applications; it uses Express (or Fastify) under the hood and is written in TypeScript. It follows Angular's architecture: you write TypeScript and use a decorator pattern, similar to how you would write an API with Spring Boot or Python Flask. Thanks to Node's single-threaded, asynchronous execution model, Nest JS back ends can also perform very well compared to server-side-rendered frameworks like Next JS or Nuxt JS.
In this tutorial we will create a simple backend application for an about page using Nest JS and MongoDB, with Mongoose as the ODM for MongoDB.
Installing Nest JS is quite simple and similar to Angular:
$ npm i -g @nestjs/cli
$ nest new demo-app
After creating the project, generate the about module, controller, and service with these commands:
$ nest g module about
$ nest g controller about
$ nest g service about
This creates the about module, controller, and service. Next, we need to install the Nest Mongoose package and the Mongoose ODM for MongoDB:
$ npm install --save @nestjs/mongoose mongoose
$ npm install --save-dev @types/mongoose
Now we need to add the Mongoose module to the app.module.ts (root module) file to connect to the MongoDB database. We have created a MongoDB database on Heroku and passed its URL as the MongooseModule.forRoot() parameter:
import { config } from './config';

imports: [
  MongooseModule.forRoot('mongodb://<username>:<password>1@ds061711.mlab.com:61711/heroku_7q1xt2kd'),
  AboutModule,
]
After installing Mongoose, we will add a schema for the about page. For simplicity, the schema will have a single property, text. Create a schemas folder inside the about folder (about/schemas/about.schema.ts) and add these lines to about.schema.ts:
import * as mongoose from 'mongoose';

const Schema = mongoose.Schema;

export const AboutSchema = new mongoose.Schema({
  text: String
});
Here we import mongoose and create AboutSchema. Next we will add an interface, which defines what our data objects will look like. Create an interfaces folder inside the about folder (about/interfaces/about.interface.ts) and add these lines to about.interface.ts. The interface inherits from Mongoose's Document class; we declare text as a string and created_at as a Date, and all properties are readonly, so they cannot be changed:
import { Document } from 'mongoose';

export interface About extends Document {
  readonly text: string;
  readonly created_at: Date;
}
Now we will add a DTO, which is important for defining the object model. It is also useful for defining the Swagger model, so we will add the Swagger property here; later we will add the Swagger setup in the main.ts file. Create a dto folder inside the about folder (about/dto/about.dto.ts) and add these lines to about.dto.ts. The @ApiProperty() decorator defines the Swagger object model and indicates that the text field takes a string:
import { ApiProperty } from '@nestjs/swagger';

export class CreateAboutDTO {
  @ApiProperty()
  readonly text: string;
}
With the DTO defined, we can create the Mongoose CRUD services in about.service.ts. To implement simple CRUD operations for MongoDB via Mongoose, add the lines below to about.service.ts. These are standard Mongoose commands, all covered in the Mongoose documentation. We use the id (MongoDB ID) parameter to update and delete documents. These are the basic, mandatory CRUD operations, but you can also write more complex queries or aggregations in services:
import { Injectable } from '@nestjs/common';
import { InjectModel } from '@nestjs/mongoose';
import { Model } from 'mongoose';
import { About } from './interfaces/about.interface';
import { CreateAboutDTO } from './dto/about.dto';

@Injectable()
export class AboutService {
  constructor(@InjectModel('About') private aboutModel: Model<About>) {}

  async create(createAboutDTO: CreateAboutDTO): Promise<About> {
    const createdAbout = new this.aboutModel(createAboutDTO);
    return createdAbout.save();
  }

  async findAll(): Promise<About[]> {
    return await this.aboutModel.find().exec();
  }

  async findById(id: string): Promise<About> {
    return await this.aboutModel.findById(id).exec();
  }

  async find(query): Promise<About[]> {
    return await this.aboutModel.find(query).exec();
  }

  async update(id: string, createAboutDTO: CreateAboutDTO): Promise<About> {
    return await this.aboutModel.findByIdAndUpdate(id, createAboutDTO, { new: true });
  }

  async delete(id: string): Promise<About> {
    return await this.aboutModel.findByIdAndRemove(id);
  }
}
After adding the service, let's create some REST APIs for the about page to save, update, find, and delete data. Add the following lines to about.controller.ts to create the REST APIs:
import { Controller, Res, Query, Get, HttpStatus, Post, Body, NotFoundException, Put, Delete } from '@nestjs/common';
import { AboutService } from './about.service';
import { CreateAboutDTO } from './dto/about.dto';

@Controller('about')
export class AboutController {
  constructor(private readonly aboutService: AboutService) {}

  @Post('/create')
  async create(@Res() res, @Body() createAboutDTO: CreateAboutDTO) {
    const lists = await this.aboutService.create(createAboutDTO);
    return res.status(HttpStatus.OK).json({
      message: 'Post has been created successfully',
      lists
    });
  }

  @Get('all')
  async findAll(@Res() res) {
    const lists = await this.aboutService.findAll();
    return res.status(HttpStatus.OK).json(lists);
  }

  @Get('id')
  async findById(@Res() res, @Query('id') id: string) {
    const lists = await this.aboutService.findById(id);
    if (!lists) throw new NotFoundException('Id does not exist!');
    return res.status(HttpStatus.OK).json(lists);
  }

  @Put('/update')
  async update(@Res() res, @Query('id') id: string, @Body() createAboutDTO: CreateAboutDTO) {
    const lists = await this.aboutService.update(id, createAboutDTO);
    if (!lists) throw new NotFoundException('Id does not exist!');
    return res.status(HttpStatus.OK).json({
      message: 'Post has been successfully updated',
      lists
    });
  }

  @Delete('/delete')
  async delete(@Res() res, @Query('id') id: string) {
    const lists = await this.aboutService.delete(id);
    if (!lists) throw new NotFoundException('Post does not exist');
    return res.status(HttpStatus.OK).json({
      message: 'Post has been deleted',
      lists
    });
  }
}
Now that we have created the Mongoose service and REST APIs, we need to register the collection name and schema in the feature module — in this case, the About module. Add the following import statement to about.module.ts to create the collection and register the Mongoose schema:
import { MongooseModule } from '@nestjs/mongoose';
import { AboutSchema } from './schemas/about.schema';

// Inside the @Module() decorator:
imports: [MongooseModule.forFeature([{ name: 'About', schema: AboutSchema }])]
Now we have added the routes that perform the CRUD operations. After adding these lines we are all set; we just need to configure Swagger to test them. Add Swagger by placing the following lines in the main.ts file:
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import { SwaggerModule, DocumentBuilder } from '@nestjs/swagger';
import { config } from './config';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  const options = new DocumentBuilder()
    .setTitle(config.swaggerApiTitle)
    .setDescription(config.swaggerApiDescription)
    .setVersion('1.0')
    .addTag(config.swaggerApiTitle)
    .build();
  const document = SwaggerModule.createDocument(app, options);
  SwaggerModule.setup('api', app, document);

  await app.listen(3000);
}
bootstrap();
We have now successfully created NestJS REST API services backed by MongoDB. You can test the API directly from Swagger by visiting this URL:
http://localhost:3000/api/
or you can test it in Postman. We hope this tutorial helps you understand basic CRUD operations in NestJS with MongoDB using the Mongoose ODM.
For more information and to develop web applications using Node.js, Hire Node Developer from us, as we deliver a high-quality product by utilizing all the latest tools and advanced technology. E-mail us anytime at hello@hkinfosoft.com or Skype us: "hkinfosoft". To develop web applications using Node.js, please visit our technology page.
Express is a web application framework for Node. It provides various features that make web-application development fast and easy, a task that otherwise takes more time when using only Node.
There are many ways to create and manage routes using Express, the simplest and most direct being the delete, get, post, and put methods. Each of them is mapped to the equivalent HTTP request verb.
const express = require("express");
const app = express();
const port = 3000;

app.get("/", (req, res) => res.send("GET"));
app.post("/", (req, res) => res.send("POST"));

app.get("/home", (req, res) => res.send("GET HOME"));
app.post("/home", (req, res) => res.send("POST HOME"));

app.get("/about", (req, res) => res.send("GET ABOUT"));
app.post("/about", (req, res) => res.send("POST ABOUT"));

app.listen(port, () =>
  console.log(`App listening at http://localhost:${port}`)
);
In the example above, we have three routes (/, /home, /about), each with two HTTP verbs (GET, POST) and its own logic that sends a unique text to the client.
As you can see even from this tiny example, this kind of approach can get very messy as the number of routes increases.
There's a way to make this code better: using the route method and chaining the handlers for equal paths:
const express = require("express");
const app = express();
const port = 3000;

app.route("/")
  .get((req, res) => res.send("GET"))
  .post((req, res) => res.send("POST"));

app.route("/home")
  .get((req, res) => res.send("GET HOME"))
  .post((req, res) => res.send("POST HOME"));

app.route("/about")
  .get((req, res) => res.send("GET ABOUT"))
  .post((req, res) => res.send("POST ABOUT"));

app.listen(port, () =>
  console.log(`App listening at http://localhost:${port}`)
);
This way we can make the code a little bit shorter, but it’s still messy since we have multiple route paths in the same file.
Our next step is to create a separate file for each path and make use of the Express Router object. The Router object is an isolated instance of middleware and routes. You can think of it as a mini-application, capable only of performing middleware and routing functions.
// /routes/root.js
const express = require("express");
const router = express.Router();

router
  .route("/")
  .get((req, res) => res.send("GET"))
  .post((req, res) => res.send("POST"));

module.exports = router;

// /routes/home.js
const express = require("express");
const router = express.Router();

router
  .route("/")
  .get((req, res) => res.send("GET HOME"))
  .post((req, res) => res.send("POST HOME"));

module.exports = router;

// /routes/about.js
const express = require("express");
const router = express.Router();

router
  .route("/")
  .get((req, res) => res.send("GET ABOUT"))
  .post((req, res) => res.send("POST ABOUT"));

module.exports = router;

// index.js
const express = require("express");
const app = express();
const port = 3000;

app.use("/", require("./routes/root"));
app.use("/home", require("./routes/home"));
app.use("/about", require("./routes/about"));

app.listen(port, () =>
  console.log(`App listening at http://localhost:${port}`)
);
This way, we have cleaner code in the main file, index.js, and each route is defined in its own file. In addition to the Router object, we make use of Express's use method to map each path to its configuration file.
It's much better than what we had in the earlier examples, but we can still see some problems, like the code repetition in each route file and, mainly, the fact that every time we add a new route file to the app, we need to change the main file to map the path to the file. With a great number of routes, we're back to the same problem: the main file gets bigger and messier.
To solve the second issue, we can make a separate file to map each path to its route file and do a simple require in the main file.
// /routes/index.js
module.exports = (app) => {
  app.use("/", require("./root"));
  app.use("/home", require("./home"));
  app.use("/about", require("./about"));
};

// index.js
const express = require("express");
const app = express();
const port = 3000;

require("./routes")(app);

app.listen(port, () =>
  console.log(`App listening at http://localhost:${port}`)
);
The routes folder's index file receives the app instance from the main file and does the path mapping. The main file simply requires the routes index file and immediately invokes the exported function, passing in the app it created. Now we have a cleaner main file, but each path still has to be mapped to its file manually.
Node can help us improve this further if we loop through the files in the routes folder and do the mapping automatically. To achieve this, we'll use Node's fs.readdirSync, which synchronously reads the contents of a given directory and returns an array with the names of all the files (or Dirent objects) in it.
And you may ask: what about the path? If we don't have an automated way to discover each route file's path, we'll still have to edit the list for every route we add. We see two possible solutions for this issue: use convention over configuration, or add an export with the path in the route file.
To use convention over configuration, we'll use the filename as the route path. To make it safer and remove potentially problematic characters, we'll use Lodash to convert the filename to snake_case, where words are separated by underscores.
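To illustrate roughly what that conversion does, here is a hand-rolled approximation of the snake_case transform applied to filenames. This is only an illustrative sketch; the article itself uses the lodash package, and the helper name is our own invention:

```javascript
// Hand-rolled approximation of lodash's snakeCase for route filenames.
// Illustrative only; the actual routes index uses require("lodash/snakeCase").
function toSnakeCase(name) {
  return name
    .replace(/\.js$/, "")                   // drop the extension
    .replace(/([a-z0-9])([A-Z])/g, "$1_$2") // split camelCase boundaries
    .replace(/[\s-]+/g, "_")                // spaces and dashes become underscores
    .toLowerCase();
}

console.log(toSnakeCase("myHome.js")); // my_home
console.log(toSnakeCase("about.js"));  // about
```

So a file named myHome.js would be served at /my_home, and about.js at /about.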
Another change we’ll make is to use the Router object only in the routes config file, making the route code simpler and avoiding code repetition.
Besides this, we'll make use of the path.join Node method to make our code cross-platform, since Windows and Linux-based systems use different path separators:
// /routes/root.js
module.exports = (router) => {
  router
    .get("/", (req, res) => res.send("GET"))
    .post("/", (req, res) => res.send("POST"));
  return router;
};

// /routes/home.js
module.exports = (router) => {
  router
    .get("/", (req, res) => res.send("GET HOME"))
    .post("/", (req, res) => res.send("POST HOME"));
  return router;
};

// /routes/about.js
module.exports = (router) => {
  router
    .get("/", (req, res) => res.send("GET ABOUT"))
    .post("/", (req, res) => res.send("POST ABOUT"));
  return router;
};

// routes/index.js
const snakeCase = require("lodash/snakeCase");
const express = require("express");

module.exports = (app) => {
  require("fs")
    .readdirSync(__dirname)
    .forEach((file) => {
      if (file === "index.js") return;
      // root.js maps to /
      const path =
        "/" + (file !== "root.js" ? snakeCase(file.replace(".js", "")) : "");
      const router = express.Router();
      const route = require(require("path").join(__dirname, file))(router);
      app.use(path, route);
    });
};

// index.js
const express = require("express");
const app = express();
const port = 3000;

require("./routes")(app);

app.listen(port, () =>
  console.log(`App listening at http://localhost:${port}`)
);
The other option, as we said earlier, is to export the desired path name in the route file and use this information in the route config file.
To achieve this, we'll export an object with path and config keys from the route file. In this kind of configuration, we strongly suggest using the filename method as a fallback to avoid errors in case you forget to add the path export. This way, we get an extra bonus: a solution that works in both situations.
In our final example, we’ll change the home route path to my_home and leave the about route file as it was.
// /routes/root.js
module.exports = {
  path: "/",
  config: (router) => {
    router
      .get("/", (req, res) => res.send("GET"))
      .post("/", (req, res) => res.send("POST"));
    return router;
  },
};

// /routes/home.js
module.exports = {
  path: "/my_home",
  config: (router) => {
    router
      .get("/", (req, res) => res.send("GET HOME"))
      .post("/", (req, res) => res.send("POST HOME"));
    return router;
  },
};

// /routes/about.js - no path export here, so the filename is used as the path
module.exports = (router) => {
  router
    .get("/", (req, res) => res.send("GET ABOUT"))
    .post("/", (req, res) => res.send("POST ABOUT"));
  return router;
};

// routes/index.js
const snakeCase = require("lodash/snakeCase");
const express = require("express");

module.exports = (app) => {
  require("fs")
    .readdirSync(__dirname)
    .forEach((file) => {
      if (file === "index.js") return;
      const router = express.Router();
      const routeModule = require(require("path").join(__dirname, file));
      const path =
        routeModule.path ||
        "/" + (file !== "root.js" ? snakeCase(file.replace(".js", "")) : "");
      const route = routeModule.config
        ? routeModule.config(router)
        : routeModule(router);
      app.use(path, route);
    });
};
And that's it. Here we have our smarter Express router config. By using this kind of configuration, together with solutions like PM2 or Nodemon to watch your app folder and restart automatically on file changes, all you have to do to add a new route is drop a file into the routes folder.
Of course, there's plenty of room for improvement, like error checking, but I'll leave that to you so you can get used to the code provided and practice your coding skills by implementing whichever feature you think is missing from my example. To make things easier on your journey, I've provided the final code in CodeSandbox. Feel free to fork it and work on it.
For more information and to develop web applications using Node.js, Hire Node Developer from us, as we deliver a high-quality product by utilizing all the latest tools and advanced technology. E-mail us anytime at hello@hkinfosoft.com or Skype us: "hkinfosoft". To develop any custom web apps using Node.js, please visit our technology page.
Before we move directly to talk about 7.4 release, let’s go through each of the earlier minor releases of Laravel 7.
We will briefly look at the new features in the 7.1 release, which introduced a convenient API resource method to work with new route caching in Laravel 7.x and the ability to customize the constrained table name.
Lasse Rafn contributed an apiResource() convenience method to work with the new Laravel 7.x route caching.
Since Laravel 7.x has a new (optimized) routing implementation, route names are more important, and cached routes will break if naming collisions happen.
Normally you can use Route::name('posts')->resource(...) to change the name of a group (useful for nested routes like /posts/{post}/comments).
However, this is not possible with apiResource.
We propose this change to allow that. It’s just a convenience to replace:
// Before
Route::name('posts')
    ->resource('posts/{post}/comments', 'PostCommentsController')
    ->only(['index', 'show', 'store', 'update', 'destroy']);

// Using the apiResource() method
Route::name('posts')
    ->apiResource('posts/{post}/comments', 'PostCommentsController');
Samuel França contributed the ability to pass a table name to the constrained() method in the ForeignIdColumnDefinition class:
Here’s one example from the tests:
$blueprint
    ->foreignId('team_column_id')
    ->constrained('teams');
This version was released with HTTP client query string support and a new timeout configuration option for the SMTP mail driver. Let’s check out the new features real quick!
ShawnCZek contributed the expectsConfirmation() method on the PendingCommand class, used to test artisan commands:
$this->artisan('foo:bar')
    ->expectsConfirmation('Do you want to continue?', 'no')
    ->assertExitCode(1);
The confirmation assertion uses expectsQuestion under the hood, but asserts the actual value from the test. The same above would originally need to be:
$this->artisan('foo:bar')
    ->expectsQuestion('Do you want to continue?', true)
    ->assertExitCode(1);
Markus Podar contributed a timeout configuration for the SMTP mail driver. The default is 30 seconds. If you want to tweak the default, add a custom timeout configuration in seconds:
'timeout' => 60, // seconds
Irfaq Syed contributed query string support to the Laravel HTTP Client, meaning you can pass a second argument to Http::get():
Here’s an example of how it works:
Http::get('https://example.com/get');
// URL: https://example.com/get

Http::get('https://example.com/get?abc=123');
// URL: https://example.com/get?abc=123

Http::get('https://example.com/get', ['foo' => 'bar']);
// URL: https://example.com/get?foo=bar

Http::get('https://example.com/get', 'foo=bar');
// URL: https://example.com/get?foo=bar
Note that passing query params to get() overrides any present in the URI, so use one or the other.
This version was released with the ability to use ^4.0 versions of ramsey/uuid. Since the release of Laravel 7.2, a few patch releases are available that we’ll briefly cover.
Laravel 7.3 adds the possibility to use ^4.0 of ramsey/uuid, but still supports v3.7 as well. The composer dependency is now ^3.7|^4.0.
Laravel 7.2.2 fixes a few Blade component issues. Notably, the make:component command now supports subdirectories:
php artisan make:component Navigation/Item

# Previously created the following:
# View/Components/Navigation/Item.php
# views/components/item.blade.php

# Now creates them as expected:
# View/Components/Navigation/Item.php
# views/components/navigation/item.blade.php
Laravel 7 introduced route caching speed improvements, but with them came a few issues for apps in the wild. Laravel 7.2.1 fixed a route naming issue with the cache; you should upgrade to the latest 7.x release to get the newest routing fixes.
It’s important to note that you should ensure the uniqueness of route names, as routes with duplicate names can “cause unexpected behavior in multiple areas of the framework.”
The latest version is released with quite a few new features, such as a custom model caster interface, a “when” higher-order collection proxy, and the ability to clear an existing “order” from the query builder.
Loris Leiva contributed the ability to use a higher-order proxy with the Collection::when() method:
// With this PR, this:
$collection->when($condition, function ($collection) use ($item) {
    $collection->push($item);
});

// ... can be refactored as this:
$collection->when($condition)->push($item);
This PR enables you to chain other higher-order proxy methods:
// This:
$collection->when($condition, function ($collection) {
    $collection->map->parseIntoSomething();
});

// ... can be refactored as this:
$collection->when($condition)->map->parseIntoSomething();
Adrian Nürnberger contributed a console test method for asking choices.
Given the following example:
$name = $this->choice('What is your name?', ['Taylor', 'Dayle'], $defaultIndex);
You could only assert the message of this question, but could not test the given choices:
$this->artisan('question')
    ->expectsQuestion('What is your name?', 'Taylor')
    ->assertExitCode(0);
As of Laravel 7.4, you can now do the following:
$this->artisan('question')
    ->expectsChoice('What is your name?', 'Taylor', ['Taylor', 'Dayle'])
    ->assertExitCode(0);
You can also guarantee the order of choices by passing a fourth boolean argument:
$this->artisan('question')
    ->expectsChoice('What is your name?', 'Taylor', ['Taylor', 'Dayle'], true)
    ->assertExitCode(0);
@nihilsen contributed the ability to define default props via @props:
<!-- Previously you might need something like: -->
@props(['type', 'message'])
@php $type = $type ?? 'info' @endphp

<!-- New defaults in Laravel >= 7.4 -->
@props(['type' => 'info', 'message'])
Brent Roose contributed a Castable interface which allows “castable” types to specify their caster classes:
// Instead of this:
class ModelX extends Model
{
    protected $casts = [
        'data' => CastToDTO::class . ':' . MyDTO::class,
    ];
}

// You could do this:
class ModelY extends Model
{
    protected $casts = [
        'data' => MyDTO::class,
    ];
}

// The underlying class:
use Illuminate\Contracts\Database\Eloquent\Castable;

class MyDTO implements Castable
{
    public static function castUsing()
    {
        return CastToDTO::class . ':' . static::class;
    }
}
Jonathan Reinink contributed a reorder() method to the query builder to reset previous orderBy() calls:
$query = DB::table('users')->orderBy('name');
$unorderedUsers = $query->reorder()->get();
reorder() also allows you to define a default order in Eloquent relationships, with the ability to back out of it when needed:
class Account extends Model
{
    public function users()
    {
        return $this->hasMany(User::class)->orderBy('name');
    }
}

// Remove the name orderBy and order by email instead
$account->users()->reorder()->orderBy('email');

// The same can be written as:
$account->users()->reorder('email');
For more information and to develop web applications using Laravel, Hire Laravel Developer from us, as we deliver a high-quality product by utilizing all the latest tools and advanced technology. E-mail us anytime at hello@hkinfosoft.com or Skype us: "hkinfosoft". To develop any custom web apps using Laravel, please visit our technology page.
57 Sherway St,
Stoney Creek, ON
L8J 0J3
606, Suvas Scala,
S P Ring Road, Nikol,
Ahmedabad 380049
1131 Baycrest Drive,
Wesley Chapel,
FL 33544
© 2025 — HK Infosoft. All Rights Reserved.
T&C | Privacy Policy | Sitemap