
php – Laravel 5.2 Queue and Jobs – Not Pushing to Jobs DB

Posted by: admin July 12, 2020



Here are the steps we are going through:

  1. The schedule runs (creating a CollectHistoricalData job for each company)
  2. CollectHistoricalData should be pushed to the queue (the jobs table)
  3. CollectHistoricalData calls ApiDaemon::GetCompanyWithQuery($company, $query), which lives in a separate class that is also referenced in a couple of other places
  4. GetCompanyWithQuery collects the data and inserts it into the database

It runs all the way through fine, but the hang-up is that rather than inserting the jobs into the jobs table, it just runs them synchronously, one after another.


The .env file is set to use the database driver via QUEUE_DRIVER, and I have even tried hard-coding it in config/queue.php.
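For reference, here is a minimal sketch of what that configuration looks like in Laravel 5.2 (which uses the QUEUE_DRIVER key; the exact values are assumptions about the poster's setup):

```php
// config/queue.php (Laravel 5.2) -- the default connection falls back to
// 'sync' unless QUEUE_DRIVER is set in .env AND the config cache is current:
'default' => env('QUEUE_DRIVER', 'sync'),

// .env
// QUEUE_DRIVER=database
```

With the 'sync' fallback active, dispatched jobs run immediately in-process instead of being written to the jobs table.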

We are using Laravel 5.2 for a project in which we need to cURL a URL every hour and save the data to the database. At first we were using cron jobs, basically firing off thousands of cURL requests in about a minute, which would crash PHP due to the load.

We decided to move over to Laravel’s Jobs and Queues, without success. We are using the database driver for our jobs, and have tried numerous approaches to getting the jobs into the database so that our daemon workers can process them.

Here is our code right now. We are using the $schedule in Kernel.php to kick things off, so we don’t have hundreds of requests attempting to happen each hour, which would result in tens of thousands of cURLs.

Kernel.php Schedule:

    $schedule
        ->call(function () {
            $items = DB::select('{selecting certain things to run}');
            foreach ($items as $q) {
                $this->dispatch(new CollectHistoricalData(Company::find($q->company_id), ApiQuery::find($q->query_id)));
            }
        })
        ->name('Historical Pulls')
        ->before(function () {
            $this->startTime = Carbon::now();
        })
        ->after(function () {
            mail({mail us a report afterward});
        });

When this runs, it sits there executing the jobs one by one, rather than pushing them to the jobs table that was created.
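Worth noting: even once the database driver is active, something has to consume the jobs table. In Laravel 5.2 that is a worker process, started with the standard artisan commands:

```shell
# Laravel 5.2: long-lived daemon worker consuming the database queue
php artisan queue:work --daemon --tries=3

# or, during development, a listener that reboots the framework per job
php artisan queue:listen
```

With only the scheduler running and no worker, jobs would pile up in the table unprocessed; the opposite symptom here (jobs never reaching the table) points at the driver resolving to sync.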



namespace App\Jobs;

use App\Helpers\Daemons\ApiDaemon;
use App\Jobs\Job;
use App\Models\Company;
use App\Models\ApiQuery;
use Illuminate\Queue\SerializesModels;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Contracts\Queue\ShouldQueue;

class CollectHistoricalData extends Job implements ShouldQueue
{
    use InteractsWithQueue, SerializesModels;

    protected $company, $query;

    /**
     * CollectHistoricalData constructor.
     *
     * @param Company $company
     * @param ApiQuery $query
     */
    public function __construct(Company $company, ApiQuery $query)
    {
        $this->company = $company;
        $this->query = $query;
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        mail({let us know what started and when});
        ApiDaemon::GetCompanyWithQuery($this->company, $this->query);
    }

    public function failed()
    {
        mail({mail us letting us know it failed});
    }
}

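For completeness, the tables used by the database driver are generated with artisan; if the jobs table was never migrated, nothing can be pushed to it:

```shell
php artisan queue:table          # creates the migration for the jobs table
php artisan queue:failed-table   # optional: table for failed jobs
php artisan migrate
```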
The job references another class with the function inside it (since that code is a hefty beast all on its own), plus there are about 20 of these, so it is easier to reference the class than to recreate all 20 classes as Jobs.


We have a schedule that is supposed to push a job, which references a function in another class, to the jobs table, but instead it runs the jobs one after another, slowly. What is causing this?

Answers:

Well… I am dumb….

php artisan config:clear

I didn’t clear the config cache…. wow…
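The likely mechanism (standard Laravel behavior, not spelled out by the poster): once `php artisan config:cache` has been run, Laravel reads bootstrap/cache/config.php and ignores later .env changes, so QUEUE_DRIVER=database never takes effect until the cache is cleared or rebuilt:

```shell
php artisan config:clear     # deletes bootstrap/cache/config.php
# or rebuild the cache so it picks up the new .env values:
php artisan config:cache
# restart any daemon workers so they load the new config:
php artisan queue:restart
```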



php artisan config:clear

wasn’t working for me; I was able to make it work using:

php artisan config:cache


In Laravel 6.x, edit .env: change QUEUE_CONNECTION=sync to QUEUE_CONNECTION=database (the key was renamed from QUEUE_DRIVER). Then dispatched jobs will be inserted into the jobs table instead of running synchronously.
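A quick way to confirm jobs are actually landing in the table (a hypothetical check, e.g. from `php artisan tinker`):

```php
// With the database driver active, each dispatch should add a row here:
DB::table('jobs')->count();
```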