Melody MIDI accompaniment generation method based on deep neural network

A deep neural network and melody technology, applied to neural learning methods, biological neural network models, neural architectures, etc., which addresses the problems of a large number of parameters, low music quality, and high time cost, and achieves the effects of short generation time, high music quality, and low hardware-resource consumption.

Active Publication Date: 2021-03-02
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

This work can achieve multi-track accompaniment generation, but its disadvantage is that it cannot generate music of controllable length and genre.
The third type of work combines the functions of the works above and can produce multi-track music of controllable length and genre; a typical example is MuseNet. However, it requires a very large amount of MIDI as a training set (on the order of millions of files), and because of the large number of training parameters, the time cost of generating a specified piece is very high.
At the same time, this work cannot perform segment continuation or melody accompaniment; it can only generate music segments of a specified duration from the beginning, and in the published demos the note density of each track is relatively sparse and the quality of the music is low.
To sum up, if a multi-track music clip of controllable style and arbitrary length needs to be produced in a short time, there is no ready-made solution on the market that meets all of the above requirements.

Method used



Examples


Embodiment Construction

[0025] The invention discloses a method for generating a melody MIDI accompaniment based on a deep neural network, which specifically includes the following steps:

[0026] (1) Use web crawlers to collect MIDI data sets with genre labels from the Internet, and classify them according to their genre labels; the genres include pop, country, and jazz. Sources for the MIDI data sets include the FreeMidi website, the public Lakh MIDI Dataset, and the MidiShow website.
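As a rough illustration of how the collected files might be organized by genre, the sketch below sorts crawled MIDI files into per-genre folders based on a label table. The file name `genre_labels.csv`, its column names, and the directory layout are assumptions made for illustration only; the patent does not specify this format.

```python
import csv
import shutil
from pathlib import Path

# Assumed layout: raw crawled files in raw_midi/, a crawled metadata table
# genre_labels.csv with columns "filename" and "genre" (pop / country / jazz),
# and an output tree dataset/<genre>/ holding the classified copies.
RAW_DIR = Path("raw_midi")
OUT_DIR = Path("dataset")
GENRES = {"pop", "country", "jazz"}

def classify_by_genre(label_csv: str = "genre_labels.csv") -> None:
    with open(label_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            genre = row["genre"].strip().lower()
            if genre not in GENRES:
                continue  # drop files whose label is outside the three target genres
            src = RAW_DIR / row["filename"]
            if not src.exists():
                continue  # skip entries whose file was not actually downloaded
            dst = OUT_DIR / genre
            dst.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst / src.name)

if __name__ == "__main__":
    classify_by_genre()
```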

[0027] (2) The MIDI data collected in step (1) is processed by melody extraction, track compression, data filtering, whole-song segmentation, and chord recognition to obtain MIDI fragments, and the MIDI fragments are shuffled to obtain the data set. The specific processing flow is shown in Figure 1 and includes the following sub-steps:

[0028] (2.1) Melody extraction: the open-source tool Midi Miner is used; the function of this tool is to analyze which track in a multi-track MIDI file is the melody track. Use Midi Mine...
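The patent delegates this sub-step to Midi Miner, whose API is not reproduced here. The snippet below is therefore only a minimal stand-in heuristic, assuming the pretty_midi library is available: it scores each non-drum track by mean pitch and note density and returns the track most likely to carry the lead line. The scoring weights are arbitrary illustrative choices, not the tool's actual logic.

```python
import pretty_midi  # assumed available; the patent itself relies on the Midi Miner tool

def find_melody_track(midi_path: str) -> int | None:
    """Return the index of the instrument most likely to be the melody.

    A crude stand-in heuristic, not Midi Miner: prefer non-drum tracks with a
    high mean pitch and enough notes to plausibly carry a lead line.
    """
    pm = pretty_midi.PrettyMIDI(midi_path)
    duration = max(pm.get_end_time(), 1e-6)
    best_idx, best_score = None, float("-inf")
    for idx, inst in enumerate(pm.instruments):
        if inst.is_drum or not inst.notes:
            continue
        mean_pitch = sum(n.pitch for n in inst.notes) / len(inst.notes)
        note_density = len(inst.notes) / duration  # notes per second
        if note_density < 0.5:
            continue  # too sparse to be a melody line
        score = mean_pitch + 5.0 * min(note_density, 4.0)
        if score > best_score:
            best_idx, best_score = idx, score
    return best_idx
```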



Abstract

The invention discloses a melody MIDI accompaniment generation method based on a deep neural network, and belongs to the technical fields of artificial intelligence and music technology. The method comprises the steps of collecting a MIDI data set with genre labels, processing the MIDI data set to obtain a training data set, encoding it through MuMIDI to obtain entries, inputting the entries into a GC-Transformer model and training until the total loss function converges, thereby completing the training of the GC-Transformer model, and finally encoding a MIDI fragment containing only the melody, inputting it into the trained GC-Transformer model, and outputting the generated accompaniment MIDI fragment. The method has the advantages of short generation time, high generation quality, low hardware-resource consumption, a small amount of training data, and the like.
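To make the pipeline described in the abstract concrete, the sketch below shows the shape of the inference path only, under heavy assumptions: `encode_mumidi`, `decode_mumidi`, `GCTransformer`, the genre-token ids, and the checkpoint name are all hypothetical placeholders standing in for the patent's MuMIDI encoding and GC-Transformer model, not published APIs; PyTorch is assumed as the backend, which the patent does not specify.

```python
import torch  # assumed deep-learning backend; the patent does not name a framework

# Hypothetical placeholders standing in for the patent's components.
from mumidi_codec import encode_mumidi, decode_mumidi   # melody MIDI <-> token ids
from gc_transformer import GCTransformer                 # trained genre-conditioned model

GENRE_TOKENS = {"pop": 1, "country": 2, "jazz": 3}        # assumed special-token ids
END_TOKEN = 0                                             # assumed end-of-sequence id

def generate_accompaniment(melody_path: str, genre: str, ckpt: str,
                           max_len: int = 2048) -> None:
    model = GCTransformer.load(ckpt)                      # hypothetical loader
    model.eval()

    # Condition the model on a genre token followed by the encoded melody.
    tokens = [GENRE_TOKENS[genre]] + encode_mumidi(melody_path)
    seq = torch.tensor(tokens, dtype=torch.long)[None, :]  # shape (1, T)

    with torch.no_grad():
        for _ in range(max_len):
            logits = model(seq)[0, -1]                    # next-token distribution
            nxt = torch.multinomial(torch.softmax(logits, dim=-1), 1)
            seq = torch.cat([seq, nxt[None, :]], dim=1)
            if nxt.item() == END_TOKEN:
                break

    decode_mumidi(seq[0].tolist(), out_path="accompaniment.mid")

if __name__ == "__main__":
    generate_accompaniment("melody_fragment.mid", genre="jazz", ckpt="gc_transformer.pt")
```

The sketch samples tokens autoregressively until an end token appears or a length limit is reached, which matches the abstract's claim that length is controllable; the specific sampling strategy (greedy, top-k, temperature) is not stated in this summary and is left as plain multinomial sampling here.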

Description

Technical field

[0001] The invention relates to the technical fields of artificial intelligence and music technology, and in particular to a method for generating melody MIDI accompaniment based on a deep neural network.

Background technique

[0002] Art creation has long been considered the exclusive domain of artists. In recent years, however, with the development of deep learning, machine art creation has made great progress and reached unprecedented heights; for example, it can generate paintings in a specified style, or musical compositions that would pass a Turing test. Music generation is a huge field facing many challenges. In particular, when users want to control many attributes of the generated music (such as the number of instruments, the music genre, etc.), the resulting music still leaves much room for improvement. This kind of task is collectively referred to as conditional controllable music generation.

[0003] The current mainstream condition-controll...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G10H1/00; G10H1/36; G06F16/951; G06N3/04; G06N3/08
CPC: G10H1/0025; G10H1/0066; G10H1/361; G06F16/951; G06N3/084; G06N3/045
Inventor: 计紫豪, 汪凯巍
Owner: ZHEJIANG UNIV